Compare commits


325 Commits
1.0.1 ... 1.0.2

Author SHA1 Message Date
Ryan Petrello
05bec924e4 Merge pull request #795 from ryanpetrello/move-deprecated-stdout
move legacy UnifiedJob stdout data to a separate unmanaged model
2017-12-13 09:35:58 -05:00
Jake McDermott
40d1f2671f Merge pull request #811 from AlanCoding/i_wont_be_ignored
Fix bug creating inventory source schedules
2017-12-12 18:32:54 -05:00
Ryan Petrello
202161f090 move legacy UnifiedJob stdout data to a separate unmanaged model
This data often (in the case of inventory updates) represents large data
blobs (5+MB per job run).  Storing it on the polymorphic base class
table, `main_unifiedjob`, causes it to be automatically fetched on every
query (and every polymorphic join) against that table, which can result
in _very_ poor performance for awx across the board.  Django offers
`defer()`, but it's quite complicated to sprinkle this everywhere (and
easy to get wrong/introduce side effects related to our RBAC and usage
of polymorphism).

This change moves the field definition to a separate unmanaged model
(which references the same underlying `main_unifiedjob` table) and adds
a proxy for fetching the data as needed

see https://github.com/ansible/awx/issues/200
2017-12-12 18:16:19 -05:00
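The deferred-fetch idea in the commit above can be sketched in plain Python (all names here are hypothetical, not AWX's actual API): the large blob lives behind a descriptor, so ordinary queries against the job never pay for it, and the data is fetched only on first access — in AWX this targeted fetch goes through the unmanaged model mapped onto the same `main_unifiedjob` table.

```python
class DeferredStdout:
    """Descriptor that fetches the large text field only when accessed."""
    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        if obj._stdout_cache is None:
            # in AWX this would be a targeted query via the unmanaged model
            obj._stdout_cache = obj._fetch_stdout()
        return obj._stdout_cache

class UnifiedJob:
    result_stdout_text = DeferredStdout()

    def __init__(self):
        self._stdout_cache = None
        self.fetch_count = 0  # counts trips to the "database"

    def _fetch_stdout(self):
        self.fetch_count += 1
        return "5MB of ansible output..."

job = UnifiedJob()
assert job.fetch_count == 0    # list views never pay the cost
job.result_stdout_text         # first access triggers the single fetch
assert job.fetch_count == 1
```

Unlike sprinkling `defer()` across every queryset, the cost is paid exactly once, at the single place the blob is actually read.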
AlanCoding
7243f871b4 fix bug creating inventory source schedules 2017-12-12 17:49:51 -05:00
Jared Tabor
2c64a2ce63 Merge pull request #783 from jaredevantabor/notification-toggle
fixing url used for determining which notification to toggle on/off
2017-12-12 11:31:48 -08:00
Jared Tabor
86eb0353c5 fixing url used for determining which notification to toggle on/off
for #711
2017-12-12 11:07:21 -08:00
Matthew Jones
282290e151 Fix an issue referencing postgres port from openshift deployment 2017-12-12 10:52:02 -05:00
Alan Rominger
8d348f916b Merge pull request #794 from AlanCoding/hide_config_pass
Hide survey passwords in saved launch configs
2017-12-12 10:37:41 -05:00
AlanCoding
659d31324d hide survey passwords in saved launch configs 2017-12-12 09:35:46 -05:00
Ryan Petrello
1bc2d83403 Merge pull request #789 from ryanpetrello/multivault-acceptance
add some more tests and acceptance docs to wrap up multivault support
2017-12-11 20:35:27 -05:00
Ryan Petrello
8c90d36290 add some more tests and acceptance docs to wrap up multivault support
see: https://github.com/ansible/awx/issues/352
2017-12-11 16:56:02 -05:00
Bill Nottingham
9be438a60a Merge pull request #788 from wenottingham/no-country-for-old-python
Don't bother checking for python 2.6 in the venv
2017-12-11 14:57:25 -05:00
Bill Nottingham
c62430c282 Drop python2.6 checks. 2017-12-11 13:59:00 -05:00
Alan Rominger
1be3c77ac6 Merge pull request #787 from AlanCoding/workflow_system_jobs
allow using SystemJobTemplates in workflows
2017-12-11 13:21:33 -05:00
AlanCoding
4adfb9804e allow using SystemJobTemplates in workflows 2017-12-11 08:58:45 -05:00
Alan Rominger
64ac1ee238 Merge pull request #746 from AlanCoding/i_forgot
Intentionally forget start_args when job is done
2017-12-11 08:08:27 -05:00
AlanCoding
0bf06479d5 add migration to remove old start_args 2017-12-10 12:08:59 -05:00
AlanCoding
1f8cab4171 intentionally forget start_args when job is done 2017-12-10 12:08:54 -05:00
Jake McDermott
526bcc4a68 Merge pull request #785 from jakemcdermott/stored-xss-test-update
fix lint error and stabilize stored xss test case
2017-12-10 12:07:46 -05:00
Jake McDermott
9dcdf20fb0 stabilize template form stored xss test case 2017-12-10 11:41:41 -05:00
Jake McDermott
be0f66fd94 fix linting error in stored xss test 2017-12-10 11:39:13 -05:00
Alan Rominger
2135291f35 Merge pull request #740 from AlanCoding/configs_rebased5
Feature: saved launchtime configurations
2017-12-08 16:55:00 -05:00
AlanCoding
a9aae91634 generalize schedule prompts validation
This makes ScheduleSerializer behave same as WFJT nodes
Prevents providing job_type for workflow jobs, as example
2017-12-08 16:23:56 -05:00
AlanCoding
905ff7dad7 fix bugs where ask_ var was checked on node 2017-12-08 13:57:33 -05:00
AlanCoding
e59a724efa fix bug that broke combining WFJT and node vars 2017-12-08 13:48:45 -05:00
AlanCoding
1c8217936d Bug fixes from integration ran on launchtime branch
Make error message for multi-vault validation more
consistent with historical message
2017-12-08 13:46:38 -05:00
AlanCoding
72a8854c27 Make ask_mapping a simple class property
from PR feedback of saved launchtime configurations
2017-12-08 13:45:23 -05:00
AlanCoding
98df442ced combine launch config and multi-cred migrations 2017-12-08 13:45:21 -05:00
AlanCoding
5ada021a6e Tweak validation to allow multiple vault credentials
support providing vault passwords based on id
include needed passwords in launch serializer defaults
2017-12-08 13:43:43 -05:00
AlanCoding
34a8e0a9b6 Feature: saved launchtime configurations
Consolidate prompts accept/reject logic in unified models
Break out accept/reject logic for variables
Surface new promptable fields on WFJT nodes, schedules

Make schedules and workflows accurately reject variables
  that are not allowed by the prompting
  rules or the survey rules on the template

Validate against unallowed extra_data in system job schedules
Prevent schedule or WFJT node POST/PATCH with unprompted data
Move system job days validation to new mechanism
Add new pseudo-field for WFJT node credential
Add validation for node related credentials
Add related config model to unified job
Use JobLaunchConfig model for launch RBAC check

Support credential overwrite behavior with multi-creds
  change modern manual launch to use merge behavior
Refactor JobLaunchSerializer, self.instance=None
Modularize job launch view to create "modern" data
Auto-create config object with every job
Add create schedule endpoint for jobs
2017-12-08 13:38:54 -05:00
Marliana Lara
cd8a4b4669 Merge pull request #645 from marshmalien/feature/retry_failed_hosts
Feature - Retry failed hosts
2017-12-07 12:54:11 -05:00
Matthew Jones
7fc896e183 Merge pull request #774 from matburt/jupyter_for_devel
Adding jupyter notebook support to the AWX development environment
2017-12-06 09:49:07 -05:00
Matthew Jones
da0b686369 Adding jupyter notebook support to the AWX development environment
* Jupyter starts alongside the other awx services and is available on
  0.0.0.0:8888
* make target: make jupyter
* default settings in settings/development.py
* Added jupyter, matplotlib, numpy to dev dependencies
2017-12-05 23:46:18 -05:00
Matthew Jones
9488105381 Merge pull request #773 from shanemcd/devel
Add m2r to setup requirements file
2017-12-05 15:38:31 -05:00
Shane McDonald
ec14ae1930 Add m2r to setup requirements file
We `pip download` this file for offline installs. Automat lists this package as a setup_requires, but `pip download` doesn’t resolve these dependencies (distutils will attempt to install them via easy_install when setup.py is invoked).
2017-12-05 15:26:56 -05:00
Greg Considine
e1e225d6a0 Merge pull request #771 from gconsidine/ui/fix/input-replace-revert
Ui/fix/input replace revert
2017-12-05 13:43:32 -05:00
gconsidine
3ad174b15b Add e2e test case to verify revert/replace 2017-12-05 12:25:48 -05:00
gconsidine
b5644ed65b Fix replace/revert functionality on secret input fields 2017-12-05 10:39:15 -05:00
Jake McDermott
13d84b8d35 Merge pull request #768 from tchia04/fix_typo_credential_types
Fix typo: Failed to get credential tpyes
2017-12-04 22:26:52 -05:00
Tony Chia
9275b024de Update main.js
Changed "credential tpyes" to "credential types"
2017-12-04 16:27:46 -08:00
Jared Tabor
4f8d4994cf Merge pull request #765 from jaredevantabor/fix-764
Update error handling on host service after angular upgrade
2017-12-04 14:28:41 -08:00
Jared Tabor
a3144ee234 Update error handling on host service after angular upgrade 2017-12-04 13:52:34 -08:00
Alan Rominger
7fe22e9c53 Merge pull request #757 from AlanCoding/vault_cred_noop
allow no-op case for vault_credential
2017-12-04 16:01:00 -05:00
Alan Rominger
42d8368596 Merge pull request #763 from AlanCoding/remember_where_you_came_from
add AWX meta extra_vars for workflow + schedule
2017-12-04 15:52:30 -05:00
AlanCoding
eecf997856 add AWX meta extra_vars: WFJT + schedule 2017-12-04 15:33:05 -05:00
Matthew Jones
21bdea05a0 Merge pull request #762 from matburt/fix_pg_port
Make sure we define postgres port customization during install
2017-12-04 14:16:09 -05:00
Matthew Jones
a3071c2a1f Make sure we define postgres port customization during install 2017-12-04 11:08:40 -05:00
Matthew Jones
cf0cc2e2f2 Add system requirements to install docs 2017-12-04 07:56:34 -05:00
Ryan Petrello
e7918ad637 Merge pull request #752 from ryanpetrello/multivault
support specifying multiple vault IDs for a playbook run
2017-12-01 11:43:39 -05:00
AlanCoding
dfc154ed95 allow no-op case for vault_credential 2017-12-01 10:29:23 -05:00
Ryan Petrello
a1f8f65add support specifying multiple vault IDs for a playbook run
see: https://github.com/ansible/awx/issues/352
2017-11-30 16:55:17 -05:00
Alan Rominger
fde5a8850d Merge pull request #748 from AlanCoding/no_job_in_relaunch
Do not show job serialization in relaunch GET
2017-11-30 09:58:35 -05:00
AlanCoding
c359c072c4 do not show job serialization in relaunch GET 2017-11-30 08:47:35 -05:00
Jake McDermott
ee0aa40542 Merge pull request #743 from jakemcdermott/gcp-service-file
fix submit when no input object defined
2017-11-29 20:50:24 -05:00
Jake McDermott
81f2184aa7 fix submit when no input object defined 2017-11-29 19:50:46 -05:00
Jake McDermott
96c66b1e20 Merge pull request #712 from jakemcdermott/gcp-service-file
add input field for gcp service account json file
2017-11-29 18:41:23 -05:00
Jake McDermott
dbb9ffbaf4 use settings when setting up user data 2017-11-29 18:27:46 -05:00
Jake McDermott
06a7c024fe add e2e test for gcp service account file input 2017-11-29 18:27:34 -05:00
Jake McDermott
1229a10f35 add gcp service account file input 2017-11-29 18:27:24 -05:00
Jake McDermott
f15b1ae549 disable textarea drag and drop when field is disabled 2017-11-29 18:27:12 -05:00
Jake McDermott
71fea2e360 allow for programmatic input to text and textarea-secret fields 2017-11-29 18:26:59 -05:00
Jake McDermott
5baa371739 add unit test for file input component 2017-11-29 18:26:47 -05:00
Jake McDermott
cc8b5bc808 add file input component 2017-11-29 18:26:36 -05:00
Alan Rominger
53c6248a6d Merge pull request #647 from AlanCoding/no_sql
remove raw SQL in visible_roles
2017-11-29 16:46:09 -05:00
AlanCoding
c4bc310271 remove raw SQL in visible_roles 2017-11-29 16:04:31 -05:00
Chris Meyers
1899795d08 Merge pull request #721 from chrismeyersfsu/feature-2_factor
allow support for saml + 2-factor
2017-11-29 14:54:57 -05:00
Alan Rominger
43c58b5bf5 Merge pull request #731 from AlanCoding/enabled_fix
fix inventory import bug with enabled_var
2017-11-29 14:46:16 -05:00
AlanCoding
2c06bfc9ce fix inventory import bug with enabled_var 2017-11-29 14:12:03 -05:00
Alan Rominger
5602b5d2d7 Merge pull request #733 from AlanCoding/credentials_in_list
Show credentials in list view
2017-11-29 12:09:52 -05:00
Chris Meyers
032318494b added tests for new settings field type 2017-11-29 11:52:00 -05:00
Bill Nottingham
40c22dcec8 Merge pull request #643 from wenottingham/whitespace-the-final-frontier
Fix extra whitespace in callback URL.
2017-11-29 11:29:45 -05:00
Alan Rominger
04f682bf7a Merge pull request #694 from AlanCoding/credentials_not_a_thing
adjust assertions about JT credentials to be correct
2017-11-29 10:41:48 -05:00
Alan Rominger
070a12a10c Merge pull request #692 from AlanCoding/vault_credential_check
Modify JT access tests to reflect new vault_credential reality
2017-11-29 10:40:35 -05:00
Wayne Witzel III
53460db4d7 Merge pull request #736 from ewjoachim/fix-social-core
Fix import of social_core.exceptions in sso/pipeline.py
2017-11-29 09:26:43 -05:00
AlanCoding
37a44c439e show credentials in list view 2017-11-29 08:14:45 -05:00
Joachim Jablon
6609f38fa2 Fix import of social_core.exceptions in sso/pipeline.py
Signed-off-by: Joachim Jablon <ewjoachim@gmail.com>
2017-11-29 14:08:58 +01:00
Alan Rominger
8f5be46d52 Merge pull request #730 from AlanCoding/list_bug
fix bug with inventory update queryset
2017-11-28 18:41:39 -05:00
Bill Nottingham
3866dcaaae Merge pull request #732 from wenottingham/quote-fu
Remove stray quote from help string.
2017-11-28 15:38:42 -05:00
Bill Nottingham
8cede51bac Remove stray quote from help string. 2017-11-28 14:32:39 -05:00
AlanCoding
a880f47925 fix bug with inventory update queryset 2017-11-28 14:13:35 -05:00
Chris Meyers
383c3cfe3e add more saml fields 2017-11-28 13:49:35 -05:00
Shane McDonald
32fcb84cf6 Merge pull request #722 from jakemcdermott/tools-awx-pycrypto
add pycrypto distro package to tools awx container image
2017-11-28 13:42:02 -05:00
Jake McDermott
34195a1b35 add pycrypto distro package to tools awx container image 2017-11-28 12:45:53 -05:00
Greg Considine
c723ba5289 Merge pull request #717 from gconsidine/ui/bump-dependency-versions
Update dependency versions to pull in latest 1.x Angular version
2017-11-28 10:11:12 -05:00
gconsidine
3ff9fa9931 Remove test on async fn with no callback and no returned promise 2017-11-28 09:54:54 -05:00
Wayne Witzel III
3202e77b57 Merge pull request #720 from wwitzel3/devel
Update to asgi_amqp 1.0.1
2017-11-27 14:57:39 -05:00
Wayne Witzel III
a858093db8 Update to asgi_amqp 1.0.1 2017-11-27 19:41:30 +00:00
gconsidine
86a559caef Update dependency versions to pull in latest 1.x Angular version 2017-11-27 11:12:45 -05:00
Michael Abashian
33ff10728d Merge pull request #680 from mabashian/delete-warning-text
Tweaked language on delete warning modal
2017-11-22 15:02:55 -05:00
mabashian
ff1f322c88 Removed unused string from credentials strings 2017-11-22 13:29:52 -05:00
mabashian
d3da899459 Defined delete string in the base with the ability to pass the resource name in 2017-11-22 13:28:02 -05:00
AlanCoding
9fe524cd20 adjust assertions about JT credentials to be correct 2017-11-21 10:03:57 -05:00
AlanCoding
1481a62b23 modify JT access tests to reflect new vault_credential reality 2017-11-21 08:40:04 -05:00
Alan Rominger
ce6d96feda Merge pull request #687 from AlanCoding/new_kill
allow deletion of new jobs
2017-11-21 07:20:05 -05:00
AlanCoding
6c57a3bb68 allow deletion of new jobs 2017-11-20 11:19:22 -05:00
mabashian
565b0b82dd Tweaked language on delete warning modal 2017-11-17 12:45:01 -05:00
Chris Meyers
98f2d936d9 allow support for saml + 2-factor
* python-social-auth has SOCIAL_AUTH_SAML_SECURITY_CONFIG, which is
forwarded to python-saml settings configuration. This commit exposes
SOCIAL_AUTH_SAML_SECURITY_CONFIG via Configure Tower in Tower to allow
users to set requestedAuthnContext, which will disable the requesting of
password type auth from the idp. Thus, it's up to the idp to choose
which auth to use (i.e. 2-factor).
2017-11-17 09:25:50 -05:00
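As a settings sketch (values hypothetical), the exposed option forwards straight through python-social-auth into python-saml's security configuration:

```python
# Hypothetical settings fragment: python-social-auth passes
# SOCIAL_AUTH_SAML_SECURITY_CONFIG through to python-saml's "security"
# section, so any python-saml security option can be set here.
SOCIAL_AUTH_SAML_SECURITY_CONFIG = {
    # False stops the SP from requesting a password authn context,
    # leaving the choice of method (e.g. 2-factor) entirely to the IdP.
    "requestedAuthnContext": False,
}
```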
Matthew Jones
9b5371f2ab Merge pull request #670 from chrismeyersfsu/job_events_docs
add docs for job events
2017-11-16 13:46:35 -05:00
Chris Meyers
c4e6fc23fc add docs for job events
* Focus on the ordering of Job Event creation. Important to know when
thinking through different Ansible execution strategies.
2017-11-16 12:48:48 -05:00
Alan Rominger
71127c039d Merge pull request #668 from AlanCoding/null_cred_okay2
Do not filter out JTs with null credentials
2017-11-16 11:18:16 -05:00
AlanCoding
127da5525c do not filter out JTs with null credentials 2017-11-16 10:20:41 -05:00
Alan Rominger
0f52ab47a0 Merge pull request #665 from AlanCoding/prefetch_credentials
prefetch UnifiedJob related credentials
2017-11-16 09:57:22 -05:00
Matthew Jones
b06a508ceb Merge pull request #651 from tumbl3w33d/646_configurable_search_domains
Make DNS search domain configurable for awx containers
2017-11-15 23:32:16 -05:00
AlanCoding
8cb5ce8307 prefetch UnifiedJob related credentials 2017-11-15 22:35:10 -05:00
Benjamin Wenzel
c1aa4129f9 Make DNS search domain configurable for awx containers
related #646
2017-11-15 21:11:56 +01:00
Matthew Jones
d6b10b7f44 Merge pull request #657 from ansible/openshift_fixes
Openshift fixes
2017-11-15 13:29:41 -05:00
Shane McDonald
e2aa9dc599 Merge pull request #658 from shanemcd/devel
Enable image stream lookups in AWX OpenShift Project
2017-11-15 13:21:51 -05:00
Shane McDonald
a043369d07 Enable image stream lookups in AWX OpenShift Project
See the OpenShift docs on this for more info: https://docs.openshift.com/container-platform/3.6/dev_guide/managing_images.html#using-is-with-k8s

If you are not using OpenShift’s internal registry you will need to manually set awx_task_openshift_image and awx_web_openshift_image.
2017-11-15 13:15:56 -05:00
Matthew Jones
03eca250d9 Fix an openshift issue writing the inventory file
Openshift was throwing an error here, though I'm not sure why it makes
a whole lot of difference to call fdopen() vs open(). This was
introduced when this method was unified under the new
ansible-inventory system. This fixes it for all cases. mkstemp(),
while not necessary, is a useful addition to keep from leaking
inventory details unnecessarily.
2017-11-15 13:12:54 -05:00
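The `mkstemp()`/`fdopen()` pattern this commit describes looks roughly like the following (function name and layout hypothetical): `mkstemp()` creates the file with mode 0600 and returns an already-open descriptor, so the inventory contents are never readable by other users, and `os.fdopen()` wraps that descriptor rather than racing a second `open()` against the path.

```python
import json
import os
from tempfile import mkstemp

def write_inventory(data):
    # mkstemp() creates the file 0600 regardless of umask and hands back
    # an open file descriptor alongside the path.
    handle, path = mkstemp(suffix='.json')
    # fdopen() turns the raw descriptor into a file object; no second
    # open() on the path is ever performed.
    with os.fdopen(handle, 'w') as f:
        f.write(json.dumps(data))
    return path

path = write_inventory({'all': {'hosts': ['web1.example.org']}})
os.remove(path)  # caller cleans up after the inventory run
```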
Matthew Jones
65d01d508b Fix an issue with handler tasks after celery upgrade
There's a bug in celery 4.X when using bound tasks as error
handlers. We don't actually need it to be bound especially since the
request object is now available in the function signature
2017-11-15 13:12:06 -05:00
Ryan Petrello
3a2ec25fb4 Merge pull request #649 from ryanpetrello/multicred
fix a permissions bug for credentials specified at JT launch time
2017-11-15 08:49:08 -05:00
Ryan Petrello
fa09d68603 add a few minor optimizations and some refactoring for multi-cred 2017-11-14 16:47:28 -05:00
Ryan Petrello
eb140d9e69 fix a permissions bug for credentials specified at JT launch time
hat tip to @alancoding for spotting this one
2017-11-14 16:21:05 -05:00
Jared Tabor
5852c16ba6 Merge pull request #613 from jaredevantabor/fix-569
removing codemirror instantiation from $transition event
2017-11-14 11:09:41 -08:00
Ryan Petrello
ebd8941439 Merge pull request #595 from ryanpetrello/multicred
replace all Job/JT relations with a single M2M credentials relation
2017-11-14 14:07:18 -05:00
Jared Tabor
32cb18fc85 removing codemirror instantiation from $transition event 2017-11-14 10:24:36 -08:00
Marliana Lara
aeb8eb3d1e Fix jshint errors 2017-11-14 13:23:05 -05:00
Marliana Lara
6654cc35f7 Add relaunch component to Completed Jobs list 2017-11-14 13:04:20 -05:00
Ryan Petrello
28ce9b700e replace all Job/JT relations with a single M2M credentials relation
Includes backwards compatibility for now-deprecated .credential,
.vault_credential, and .extra_credentials

This is a building block for multi-vault implementation and Alan's saved
launch configurations (both coming soon)

see: https://github.com/ansible/awx/issues/352
see: https://github.com/ansible/awx/issues/169
2017-11-14 12:49:12 -05:00
Bill Nottingham
0558bd82bb Fix extra whitespace in callback URL. 2017-11-14 12:03:52 -05:00
Ryan Petrello
f887aaa71f Merge pull request #637 from ryanpetrello/fix-django-settings-bug
undo an optimization in django.conf.settings that breaks awx settings
2017-11-14 11:56:24 -05:00
Marliana Lara
69ada03b7b Add relaunch component to Job Results panel 2017-11-14 11:51:53 -05:00
Marliana Lara
ee6beae50a Add relaunchButton component and styles 2017-11-14 11:41:46 -05:00
Matthew Jones
799feac0e1 Merge pull request #638 from shanemcd/devel
Fix OpenShift configmap
2017-11-14 11:40:12 -05:00
Shane McDonald
0d86678a44 Fix OpenShift configmap
These variables changed in 8faf588775
2017-11-14 11:32:05 -05:00
Ryan Petrello
38f893c124 undo an optimization in django.conf.settings that breaks awx settings 2017-11-14 11:03:50 -05:00
Greg Considine
a2b444f179 Merge pull request #625 from gconsidine/ui/fix/home-dashboard-popover
Revert versions of D3 used by awx and ansible/nvd3
2017-11-14 10:48:40 -05:00
Matthew Jones
f46bacdeaa Merge pull request #636 from ansible/fix_celery_inspector
Delay instantiation of the celery app for the inspector
2017-11-14 10:42:10 -05:00
Matthew Jones
9ee77a95c6 Delay instantiation of the celery app for the inspector
This keeps the instance from re-using a pool that might have already
expired and is unusable for the inspector that needs to run as part of
the task manager
2017-11-14 10:33:47 -05:00
Alan Rominger
93b80307db Merge pull request #624 from AlanCoding/dev_super
get development supervisor use working again
2017-11-14 10:14:11 -05:00
Alan Rominger
0a883edd4d Merge pull request #632 from AlanCoding/some_test_fixes
do not use expensive visible_roles for Activity Stream filter
2017-11-14 09:56:27 -05:00
gconsidine
4cd9556f7b Revert versions of D3 used by awx and ansible/nvd3 2017-11-14 09:47:25 -05:00
Wayne Witzel III
9ed2a0da8f Merge pull request #627 from wwitzel3/devel
Fix image_build
2017-11-14 08:53:21 -05:00
AlanCoding
7eac219eae do not use expensive visible_roles for Act Stream filter 2017-11-14 08:37:14 -05:00
AlanCoding
805170ffd7 get development supervisor use working again 2017-11-13 20:31:32 -05:00
Wayne Witzel III
d696f6c3f6 Fix image_build 2017-11-13 19:11:58 -05:00
Wayne Witzel III
3cdeb446c4 Merge pull request #622 from wwitzel3/devel
Using metavar with a flag is not allowed or useful
2017-11-13 17:12:03 -05:00
Wayne Witzel III
58737a8e28 Using metavar with a flag is not allowed or useful 2017-11-13 16:05:54 -05:00
Wayne Witzel III
2fb74f5b02 Merge pull request #621 from wwitzel3/devel
Fix mgmt cmds, use real types not strings
2017-11-13 15:38:19 -05:00
Wayne Witzel III
768a3f62f1 Fix mgmt cmds, use real types not strings 2017-11-13 15:32:31 -05:00
Bill Nottingham
b03a64dd53 Merge pull request #567 from wenottingham/the-source--not-just-a-magazine
Assorted updates to project_update.yml
2017-11-13 15:18:42 -05:00
Shane McDonald
386382c456 Merge pull request #619 from wwitzel3/devel
Fix installer references to asgi_amqp
2017-11-13 14:10:58 -05:00
Wayne Witzel III
d9f8f7721a Fix installer references to asgi_amqp 2017-11-13 13:39:39 -05:00
Wayne Witzel III
d2711f4af0 Merge pull request #618 from wwitzel3/devel
Silence models.E006 until we can rename the Project and InventorySource models
2017-11-13 13:23:15 -05:00
Wayne Witzel III
77fd7ea4a8 Silence models.E006 until we can rename the Project and InventorySource models 2017-11-13 13:19:29 -05:00
Wayne Witzel III
faa5a5e024 Merge pull request #600 from wwitzel3/django111
Upgrade AWX major dependencies
2017-11-13 12:26:04 -05:00
Wayne Witzel III
798d27c2cb Fix task_manager test 2017-11-13 12:02:00 -05:00
Wayne Witzel III
5b4dc9e7ee Disable group sending in consumer (Issue ansible/awx#615) 2017-11-13 10:19:14 -05:00
Wayne Witzel III
f118e27047 Flake8 fixes and URL updates 2017-11-10 17:04:33 -05:00
Michael Abashian
2ab33467d8 Merge pull request #601 from mabashian/275-delete-warnings
More verbose delete warnings
2017-11-10 16:40:12 -05:00
mabashian
42a6757a10 Pass params in object to request function 2017-11-10 16:14:40 -05:00
gconsidine
aa38c1123c Check for null resource and update e2e model usage 2017-11-10 15:59:38 -05:00
gconsidine
5fcff09aae Update string-related component tests 2017-11-10 12:33:27 -05:00
mabashian
5e2ecda413 Define type in delete jt unit test 2017-11-10 11:55:41 -05:00
mabashian
25dc3f8778 Update delete modals and fixed unit test failures 2017-11-10 11:31:11 -05:00
Michael Abashian
2957f5bc7f Merge pull request #1 from gconsidine/275-delete-warnings
275 delete warnings
2017-11-10 11:28:33 -05:00
gconsidine
8713e38c44 Update the base model to use string service instead of each sub model 2017-11-10 10:42:05 -05:00
Wayne Witzel III
96904968d8 Fix migration issues, tests, and templates 2017-11-09 17:29:48 -05:00
Wayne Witzel III
6d6bbbb627 Update URL structure, fixed string based calls 2017-11-09 17:24:04 -05:00
Wayne Witzel III
14c5123fda Update celery environ and tasks 2017-11-09 17:21:19 -05:00
Wayne Witzel III
de376292ba Update management commands 2017-11-09 17:18:18 -05:00
Wayne Witzel III
8faf588775 Update package versions, settings, and tooling 2017-11-09 17:17:30 -05:00
gconsidine
e8fd40ace0 Update model request interface and references 2017-11-09 17:01:32 -05:00
Bill Nottingham
a2b18a9f6e Add test to short-circuit checkout if revision is already checked out.
Move role checkout to a separate play, to work with this.
2017-11-08 18:29:59 -05:00
Jake McDermott
8f6289707b Merge pull request #596 from jakemcdermott/stored-xss-test
add test suite for stored xss
2017-11-08 16:47:03 -05:00
mabashian
4cd2f93c31 Updated delete warnings to indicate resources that may be invalidated as a result of deletion 2017-11-08 16:38:34 -05:00
Jake McDermott
79f450df8e add stored xss test suite 2017-11-07 13:43:20 -05:00
Jake McDermott
aab66b8ce8 add namespacing, schedules, and sources to fixtures 2017-11-07 13:43:05 -05:00
Jake McDermott
0afe94c4d4 add navigateTo command 2017-11-07 13:42:41 -05:00
Chris Church
6c1919273b Merge pull request #551 from cchurch/🥓
Include JSON string in temporary inventory script with %r
2017-11-06 17:46:12 -05:00
Alan Rominger
1ed3a8f0e9 Merge pull request #566 from AlanCoding/no_can_read
Fix bug where system gets 404 viewing job detail view
2017-11-06 15:23:30 -05:00
Michael Abashian
7dc30ab866 Merge pull request #554 from mabashian/274-right-click-new-tab
Fixed most lists so that name links can be opened in a new tab
2017-11-06 15:18:11 -05:00
mabashian
8f82fc26a2 Removed commented line from schedule list config 2017-11-06 14:46:31 -05:00
Matthew Jones
74c9b9cf6a Adding pycrypto distro package
Without this a lot of things break and it's no longer marked as a
dependency for the ansible core project
2017-11-06 11:18:45 -05:00
Bill Nottingham
632ff959ff Merge pull request #573 from wenottingham/going-up-for-some-headers
Preformatted text doesn't actually work in our popovers; don't try to use it.
2017-11-03 15:23:05 -04:00
Bill Nottingham
19d093f7aa Preformatted text doesn't actually work in our popovers; don't try to use it. 2017-11-03 13:25:58 -04:00
AlanCoding
270a41443c fix bug of system auditor 404 viewing job 2017-11-03 08:20:15 -04:00
Jake McDermott
8666512d99 Merge pull request #550 from jakemcdermott/run_both_unit_test_suites
run both ui unit test suites and linting tasks, collect results for shippable
2017-11-02 15:40:30 -04:00
Bill Nottingham
c827e73dac Update comments and task names. 2017-11-02 14:11:48 -04:00
Matthew Jones
b70f7bd866 Merge pull request #549 from cchurch/allow-non-fqdn-for-ldap-server-uri
Allow non-FQDN for AUTH_LDAP_SERVER_URI.
2017-11-02 08:57:04 -04:00
Jake McDermott
77e11fe8fe collect unit test results for shippable
Signed-off-by: Jake McDermott <jmcdermott@ansible.com>
2017-11-02 01:22:07 -04:00
Jake McDermott
93f35b037d remove unused config 2017-11-01 15:31:00 -04:00
mabashian
d056cb22ef Fixed most lists so that name links can be opened in a new tab 2017-11-01 14:26:48 -04:00
Jake McDermott
4883876dc5 run both unit test suites and linting tasks 2017-11-01 13:36:34 -04:00
Chris Church
863b5e2e8e Output repr() of JSON in temporary inventory script to prevent Python from devouring escape sequences. 2017-11-01 12:59:49 -04:00
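The escape-sequence problem this commit fixes can be demonstrated in a few lines (variable names hypothetical): interpolating a JSON string into generated Python source with `%s` lets the compiler of that source re-interpret `\n` and friends, while `%r` emits a quoted literal whose backslashes survive the round trip.

```python
import json

data = {"msg": "line1\nline2"}
blob = json.dumps(data)  # contains the two-character escape \n

# With %s, the literal \n inside blob would become a real newline when
# the generated script source is compiled, corrupting the embedded JSON.
# %r doubles the backslash, so the string round-trips intact.
script = "inventory = %r" % blob

ns = {}
exec(script, ns)
assert json.loads(ns["inventory"]) == data  # survives unchanged
```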
Chris Church
0f8e073d10 Allow non-FQDN for AUTH_LDAP_SERVER_URI. 2017-11-01 12:51:41 -04:00
Alan Rominger
0579db1162 Merge pull request #439 from AlanCoding/retry_subset
Feature: retry on subset of jobs hosts
2017-11-01 11:33:15 -04:00
Bill Nottingham
7f20118d48 Merge pull request #547 from wenottingham/its-time-for-an-audit
Add system auditor placeholder.
2017-11-01 11:32:15 -04:00
Bill Nottingham
89d0f90e27 Add system auditor placeholder. 2017-11-01 10:46:39 -04:00
AlanCoding
41c84b4652 update retry-on-failed acceptance docs
Relaunching by other status values is tabled for later.
2017-11-01 10:24:46 -04:00
AlanCoding
0ae9283fba Feature: retry on subset of jobs hosts 2017-11-01 10:22:52 -04:00
Matthew Jones
f1813c35ed Merge pull request #528 from AlanCoding/fix_dep_update
fix bug with dependent SCM inv updates
2017-11-01 09:03:11 -04:00
Matthew Jones
0c5978715e Merge pull request #523 from AlanCoding/wfjt_spec_fix
fix admin edit of WFJT survey spec
2017-11-01 09:02:36 -04:00
Matthew Jones
5c1a6b7d6d Merge pull request #535 from matburt/fix_pgdata_issue
Specify a PGDATA directory to prevent container re-create issues
2017-11-01 08:46:24 -04:00
Jim Ladd
84c439b774 Merge pull request #542 from jladdjr/awx_349_acceptance_doc
Update custom credential document for multi-file injection
2017-10-31 19:14:37 -04:00
Jim Ladd
655759a5fc Update custom credential document for multi-file injection 2017-10-31 16:34:03 -04:00
Jake McDermott
6c85902ce8 Merge pull request #541 from jakemcdermott/update-credentials-title-selector
fix credentials form title selector
2017-10-31 16:30:00 -04:00
Jake McDermott
ef0c2086eb fix credentials form title selector when running container chrome 2017-10-31 16:11:44 -04:00
Matthew Jones
ffb148aaa9 Merge pull request #534 from dleehr/fix-install-2.3
Updates INSTALL.md to reflect Ansible 2.4 requirement
2017-10-31 15:07:06 -04:00
Matthew Jones
bf281f6ea9 Specify a PGDATA directory to prevent container re-create issues 2017-10-31 10:20:08 -04:00
Dan Leehr
641897713f Updates INSTALL.md to reflect Ansible 2.4 requirement 2017-10-30 22:41:58 -04:00
AlanCoding
d7ae95684c fix bug with dependent SCM inv updates
This change causes all SCM inventory updates to run a local
project sync unless they were specifically marked as a
dependency of an already-existing project update, as
opposed to just doing so on manual launch types.

This should be a more robust criteria.
2017-10-30 11:59:33 -04:00
AlanCoding
8b39b3b41a fix admin edit of WFJT survey spec 2017-10-29 16:27:16 -04:00
Ryan Petrello
0876d7825c Merge pull request #520 from ryanpetrello/phantom-version-comment
help people avoid mistakenly inputting their version info as a comment
2017-10-27 14:59:11 -04:00
Ryan Petrello
3953366a9e help people avoid mistakenly inputting their version info as a comment 2017-10-27 14:43:20 -04:00
Alan Rominger
d7f5ef6564 Merge pull request #511 from AlanCoding/wrong_type_error
raise error for invalid type lookup
2017-10-27 14:14:11 -04:00
Matthew Jones
63cf681369 Merge pull request #418 from Comradephate/patch-1
Divorce the "local docker install" portion of the install playbook from the image build + push logic
2017-10-27 12:31:09 -04:00
Jared Tabor
a0f1c8fc7c Merge pull request #499 from jaredevantabor/project-based-nav
Adding Project Based Navigation of Job Templates
2017-10-26 18:11:34 -07:00
Jared Tabor
4fbfddaa93 changes from PR feedback: removing ghost action icon
and fixing a bug where the list of job templates was improperly
updated when a job was running and live events were received.
2017-10-26 16:50:12 -07:00
Alan Rominger
bc7793def1 Merge pull request #494 from AlanCoding/get_queryset_modest_refactor
Refactor get_queryset inside of access.py
2017-10-26 13:52:08 -04:00
AlanCoding
4e16b19ae6 Refactor access.py get_queryset into filtering method
Use BaseAccess class to enforce the superuser and system
  auditor conditions, as well as the optimizations.
Declare optimizations on access class as tuple.
Limit role of access class method narrowly to RBAC filtering.
2017-10-26 11:40:08 -04:00
AlanCoding
b4a446dba0 raise error for invalid type lookup 2017-10-26 11:25:40 -04:00
Alan Rominger
641b18fe13 Merge pull request #509 from AlanCoding/lib_test_fixes
update tests to new Ansible core code
2017-10-26 09:28:52 -04:00
Bill Nottingham
c680327ec3 Merge pull request #506 from wenottingham/its-log
Remove accidentally committed log files
2017-10-26 09:17:30 -04:00
AlanCoding
e5d2eb9f3d update tests to new Ansible core code 2017-10-26 08:34:00 -04:00
Bill Nottingham
da25f4104c Update .gitignore for npm log files. 2017-10-25 21:37:29 -04:00
Bill Nottingham
871dc81da3 Avoid task duplication by using default(omit). 2017-10-25 21:30:46 -04:00
Bill Nottingham
1285e8ffef Add blocks around the different SCMs, for clarity purposes. 2017-10-25 16:58:41 -04:00
Bill Nottingham
a8947c3b96 Remove accidentally committed log files 2017-10-25 16:38:56 -04:00
Greg Considine
565d116955 Merge pull request #505 from gconsidine/ui/fix/multiple-dependency-include
Update dependencies that share Angular as a dependency
2017-10-25 16:07:29 -04:00
Jake McDermott
1fe9f43690 Merge pull request #502 from jakemcdermott/update_test_config
test config cleanup and tooling updates
2017-10-25 15:39:55 -04:00
gconsidine
4a522fd10f Update dependencies that share Angular as a dependency 2017-10-25 14:48:18 -04:00
Jared Tabor
5e349590fd making JOB TEMPLATES tab the last tab on the projects form 2017-10-25 11:21:36 -07:00
Jared Tabor
625c0ad578 Adding Project Based Navigation of Job Templates
This adds a Job Templates tab onto the Project form that gives
the user the ability to see all the job templates using a project.
Clicking the add button on this list will take the user to the job
template form with the project field auto-filled with the project.
2017-10-25 11:21:36 -07:00
Jake McDermott
3800a16f3e refactor e2e settings and config modules
This should make the settings and configuration logic less implicit and
a little easier to follow. Some familiarity with the configuration behavior
of nightwatch is still necessary in places - specifically, one should know
that all test_settings defined for non-default environments are treated as
overrides to the values defined for the default environment.
2017-10-25 10:58:39 -04:00
Jake McDermott
d70a0c8c24 cleanup e2e test development tooling and add readme examples 2017-10-25 10:22:18 -04:00
Aaron Tan
e999b35c42 Merge pull request #493 from jangsutsr/fix-474
Add protection against credential getattr
2017-10-25 09:45:25 -04:00
Aaron Tan
553e81f888 Add protection against credential getattr
Relates #474.

Add protection in `__getattr__` method to prevent possible infinite
recursion loop.

Signed-off-by: Aaron Tan <jangsutsr@gmail.com>
2017-10-24 12:08:41 -04:00
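The hazard that commit guards against can be shown with a minimal sketch. This is not AWX's actual `Credential` model — the class and field names are hypothetical — it only illustrates why `__getattr__` needs protection:

```python
class Credential:
    """Illustrative sketch: __getattr__ is called for every missing
    attribute, so reading ``self.something`` inside it can re-enter
    __getattr__ forever, e.g. on a half-initialized instance."""

    def __init__(self, kind):
        self.kind = kind

    def __getattr__(self, name):
        # Protection: consult __dict__ directly instead of attribute
        # access, and raise AttributeError for unknown names rather
        # than recursing.
        kind = self.__dict__.get('kind')
        if kind is not None and name in ('username', 'password'):
            return ''  # hypothetical dynamic credential field default
        raise AttributeError(name)
```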
Alan Rominger
73ece87e68 Merge pull request #487 from AlanCoding/E722
flake8: comply with new E722 rule
2017-10-23 14:58:27 -04:00
AlanCoding
90f63774f4 flake8: comply with new E722 rule 2017-10-23 14:36:48 -04:00
Marliana Lara
17aecc17d2 Merge pull request #472 from marshmalien/angular_upgrade_1_6_6
Upgrade to AngularJS v1.6.6
2017-10-23 14:13:11 -04:00
Marliana Lara
9157f53d43 Update angular-scheduler and angular-tz-extensions versions 2017-10-23 12:29:28 -04:00
Marliana Lara
9bb696aa6e Fix for input directive using strict comparison to determined "checked" row 2017-10-23 12:29:27 -04:00
Marliana Lara
32da686724 Handle errors with ProcessErrors 2017-10-23 12:29:26 -04:00
Marliana Lara
cee81e9df6 Fix any unhandled rejections 2017-10-23 12:29:25 -04:00
Ben Thomasson
b544922da1 Fix incorrect JS syntax 2017-10-23 12:29:24 -04:00
Ben Thomasson
b3c2f35358 Change success to then manually 2017-10-23 12:29:23 -04:00
Ben Thomasson
7d767f8f63 Automatically change .error to .catch.
Use this script to change .error to .catch using this linux script:

    #!/bin/bash -ex
    #Run in awx/awx/ui/client/src
    FILES=`grep -l -R "\.error(\s*function\s*(data,\s*status)\s*{" . | xargs`
    sed -i "s/\.error(\s*function\s*(data,\s*status)\s*{/.catch(({data, status}) => {/g" $FILES

    FILES=`grep -l -R "\.error(this\.error\.bind(this))" . | xargs`
    sed -i "s/\.error(this\.error\.bind(this))/\.catch(this\.catch\.bind(this))/g" $FILES

    FILES=`grep -l -R "\.error(\s*function\s*(error)\s*{" . | xargs`
    sed -i "s/\.error(\s*function\s*(error)\s*{/.catch(({error}) => {/g" $FILES

    FILES=`grep -l -R "\.error(\s*function\s*(obj,\s*status)\s*{" . | xargs`
    sed -i "s/\.error(\s*function\s*(obj,\s*status)\s*{/.catch(({obj, status}) => {/g" $FILES

    FILES=`grep -l -R "\.error(\s*function\s*(res,\s*status)\s*{" . | xargs`
    sed -i "s/\.error(\s*function\s*(res,\s*status)\s*{/.catch(({res, status}) => {/g" $FILES

    FILES=`grep -l -R "\.error(\s*function\s*(msg,\s*code)\s*{" . | xargs`
    sed -i "s/\.error(\s*function\s*(msg,\s*code)\s*{/.catch(({msg, code}) => {/g" $FILES

    FILES=`grep -l -R "\.error(\s*function\s*()\s*{" . | xargs`
    sed -i "s/\.error(\s*function\s*()\s*{/.catch(() => {/g" $FILES
2017-10-23 12:29:22 -04:00
Ben Thomasson
834e6c692c Automatically change instances of .success to .then with this linux script:
#Run in awx/awx/ui/client/src

    #!/bin/bash -ex
    FILES=`grep -l -R "\.success(\s*function\s*(data)\s*{" . | xargs`
    sed -i "s/\.success(\s*function\s*(data)\s*{/\.then(({data}) => {/g" $FILES

    FILES=`grep -l -R "\.success(\s*function\s*()\s*{" . | xargs`
    sed -i "s/\.success(\s*function\s*()\s*{/\.then(() => {/g" $FILES

    FILES=`grep -l -R "\.success(this\.success\.bind(this))" . | xargs`
    sed -i "s/\.success(this\.success\.bind(this))/\.then(this\.then\.bind(this))/g" $FILES
2017-10-23 12:29:21 -04:00
Ben Thomasson
fabdab78ef Upgrade AngularJS to 1.6.6 2017-10-23 12:29:10 -04:00
Bill Nottingham
a313109b15 Merge pull request #475 from wenottingham/scm-tooltip
Update tooltip for update-on-launch.
2017-10-20 10:32:34 -04:00
Ryan Petrello
7b36630f47 Merge pull request #460 from ryanpetrello/cloudforms-cache-path
store cloudforms inventory cache files in the proper location on disk
2017-10-20 09:40:02 -04:00
Jaron Rolfe
cc5f329d33 Explanation for image removal block and idiomatic handling of var that enables it 2017-10-19 21:43:37 -04:00
Bill Nottingham
31c1e1684d Update tooltip for update-on-launch.
This better describes how this setting is used.
2017-10-19 16:57:50 -04:00
Alan Rominger
083e56d97d Merge pull request #464 from AlanCoding/actiony_permissions
fix bug where JT admins could not edit spec
2017-10-19 16:22:14 -04:00
AlanCoding
098a407e25 fix bug where JT admins could not edit spec 2017-10-19 16:05:44 -04:00
Alan Rominger
6347db56c5 Merge pull request #473 from AlanCoding/test_fix_exc
fix test fallout from 321 merge
2017-10-19 16:05:01 -04:00
AlanCoding
e660879a00 fix test fallout from 321 merge 2017-10-19 15:47:51 -04:00
Matthew Jones
5635f5fb49 Merge branch 'release_3.2.1' into devel
* release_3.2.1:
  fallback to empty dict when processing extra_data
  fix migration problem from 3.1.1
  move 0005a migration to 0005b
  feedback on ad hoc prohibited vars error msg
  Fix the way we include i18n files in sdist
  Fix migrations to support 3.1.2 -> 3.2.1+ upgrade path
  fix missing parameter to update_capacity method
  fix WARNING log when launching ad hoc command
  Validate against ansible variables on ad hoc launch
  do not allow ansible connection type of local for ad_hoc
  work around an ansible 2.4 inventory caching bug
  fix scan job migration unicode issue
  Assert isolated nodes have capacity set to 0 and restored based on version
  Set capacity to zero if the isolated node has an old version
2017-10-19 13:30:26 -04:00
Jared Tabor
0497a4ba96 Merge pull request #396 from jaredevantabor/ui-router
Upgrade Angular UI Router to v1.0.7
2017-10-18 20:31:03 -07:00
Jared Tabor
275e02a8cf fixing issue with scrolling due to UI-router upgrade 2017-10-18 20:16:05 -07:00
Jared Tabor
887a09d052 fixing issue from UI-router upgrade where document title wouldn't update
with the name of the state.
2017-10-18 16:34:39 -07:00
Jared Tabor
02af117f51 adjusting unit tests to pass 2017-10-18 16:34:39 -07:00
Jared Tabor
6e2de1b4b0 changing "dyanmic" to "dynamic" 2017-10-18 16:34:39 -07:00
Jared Tabor
47f743e623 fixing removeTerm for smart search to work 2017-10-18 16:34:39 -07:00
Jared Tabor
4f0fa57a1b removing $urlMatcherFactory and $urlRouter b/c they're deprecated
in favor of $urlService
2017-10-18 16:34:39 -07:00
Jared Tabor
3d5f301a07 removing notify:false, it's deprecated 2017-10-18 16:34:39 -07:00
Jared Tabor
f9c991e660 replacing all stateChangeSuccess for $transition.onSuccess 2017-10-18 16:34:39 -07:00
Jared Tabor
6e3f4a7a6e fixing issues w/ lazyloaded states 2017-10-18 16:34:38 -07:00
Jared Tabor
07139820cb updating angular-ui-router to v. 1.0.7 2017-10-18 16:34:38 -07:00
Ryan Petrello
764356bf47 Merge pull request #459 from ryanpetrello/simplified-inventory-building
remove support for job-scoped auth tokens
2017-10-18 17:35:37 -04:00
Ryan Petrello
ea683344f5 remove support for job-scoped auth tokens
When Jobs and Adhoc Commands are launched, awx uses a job-scoped auth
token to dynamically fetch inventory via the awx REST API; this process
is complicated, hard to debug, and likely won't work going forward with
oauth2-based tokens in awx

see: https://github.com/ansible/awx/issues/21
2017-10-18 17:11:47 -04:00
Jared Tabor
bff13e168a Merge pull request #461 from jaredevantabor/host-event-selecting
Fixing Host Event Modal Selecting
2017-10-18 09:23:46 -07:00
Jared Tabor
774a3da7f4 generalizing class which is ignored when trying to drag the host-event-modal
it was only applied to .CodeMirror, which is only used by the JSON tab
2017-10-17 16:03:39 -07:00
Bill Nottingham
5f3b4575de Merge pull request #456 from wenottingham/i-am-becoming
Set ANSIBLE_BECOME_ASK_PASS to avoid deprecation warning.
2017-10-17 18:11:33 -04:00
Ryan Petrello
59f9967dba store cloudforms inventory cache files in the proper location on disk
with process isolation enabled (which is the awx default), cloudforms
caches inventory script results on disk; awx should direct cloudforms to
store these cache files in a location that's exposed to the isolated
environment

see: ansible/ansible#31760
2017-10-17 17:06:48 -04:00
Bill Nottingham
058475c131 Set ANSIBLE_BECOME_ASK_PASS to avoid deprecation warning. 2017-10-17 16:00:16 -04:00
Chris Meyers
3685cb5517 Merge pull request #440 from chrismeyersfsu/fix-callback_unit_tests
fixes ansible callback import json warning
2017-10-17 13:53:45 -04:00
Chris Meyers
4e2cf62e89 fixes ansible callback import json warning
[WARNING]: Failure using method (v2_runner_on_ok) in callback plugin
(<awx_display_callback.module.AWXDefaultCallbackModule object at
0x47f6090>):
'module' object has no attribute 'dumps'

The above error is thrown by ansible if callback plugins don't respect
the same import precedence configuration as Ansible. ansible callback/*
dir includes a json.py file. This is imported by ansible
callback/__init__.py when a callback plugin implementation imports from
Ansible callback base without setting the correct import precedence.
2017-10-16 10:29:41 -04:00
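The shadowing mechanism behind that warning can be demonstrated in isolation. This is a contrived, self-contained demo (not AWX's actual fix): a directory containing its own `json.py`, placed ahead of the stdlib on `sys.path`, hides the real `json` module — which is how a plugin's `callback/json.py` ends up imported in place of the stdlib and `dumps` goes missing:

```python
import os
import sys
import tempfile

# Create a directory with a file named json.py, mimicking the
# callback/ dir described in the commit message.
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, 'json.py'), 'w') as f:
    f.write('# a helper module that happens to be named json.py\n')

sys.path.insert(0, workdir)
sys.modules.pop('json', None)     # force re-resolution of "json"
import json as shadowed_json

has_dumps = hasattr(shadowed_json, 'dumps')  # the stdlib json is hidden

# Cleanup: restore normal import resolution.
sys.path.remove(workdir)
sys.modules.pop('json', None)
```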
Jaron Rolfe
5e17d72922 Improve push capabilities and allow build playbook to push 2017-10-16 00:38:28 -04:00
Jaron Rolfe
67df298f21 Replace deprecated "include" with "include_tasks" 2017-10-15 22:15:11 -04:00
Alan Rominger
353a9a55c7 Merge pull request #406 from AlanCoding/variables_debt
Consolidation of variables parsing throughout codebase
2017-10-13 15:40:36 -04:00
Matthew Jones
0ac3598ca5 Merge pull request #431 from matburt/lower_awx_uid
Lower the default uid by which we'll rewrite passwd
2017-10-13 15:33:36 -04:00
Jake McDermott
06f06173b0 Merge pull request #408 from jakemcdermott/smoke-tests
add smoke test
2017-10-13 15:21:00 -04:00
Matthew Jones
da5e6883d4 Lower the default uid by which we'll rewrite passwd
This fixes some issues with openshift under certain security policies
2017-10-13 14:27:30 -04:00
Matthew Jones
ef05df9224 Merge pull request #421 from carbonin/use_http_host_in_slash_redirect
Use $http_host in trailing slash redirect
2017-10-13 14:12:21 -04:00
Ryan Petrello
8b8c0e325f Merge pull request #430 from ryanpetrello/fix-isolated-version
stop hard-coding the awx version in the isolated development environment
2017-10-13 12:35:37 -04:00
Ryan Petrello
5bb06fdb50 stop hard-coding the awx version in the isolated development environment
see: #296
2017-10-13 12:17:04 -04:00
AlanCoding
993fa9290d additional verbosity for vars parsing exceptions 2017-10-13 11:41:11 -04:00
Jim Ladd
5924571904 Merge pull request #384 from jladdjr/awx365_post_response_discrepancy
Address discrepancy in POST response between jobs launches and project / inventory source updates
2017-10-13 09:59:29 -04:00
Chris Meyers
9cc4520a34 Merge pull request #409 from chrismeyersfsu/replay_job_events
add job event replay awx-manage command
2017-10-13 09:39:28 -04:00
Chris Meyers
62987196cb add speedup support to event replay and stats
* add tests
* add verbosity support
2017-10-13 09:25:18 -04:00
Nick Carboni
cfa21af432 Use $http_host in trailing slash redirect
This allows the port from the request header to be used
rather than having the request redirected to the port
being used inside the container which may not be
accessible

Fixes #420
related #420

Signed-off-by: Nick Carboni <ncarboni@redhat.com>
2017-10-12 17:35:55 -04:00
Jim Ladd
6f1c7ee733 Update several endpoints to match JT launch POST response
Signed-off-by: Jim Ladd <jladd@redhat.com>
2017-10-12 17:35:34 -04:00
Aaron Tan
bcd2a8f211 Merge pull request #382 from jangsutsr/fix-264
Implement workflow job failure
2017-10-12 16:34:08 -04:00
Jaron Rolfe
ee15db4c7c allow for private registry without latest tag
The logic that sets awx_web_docker_actual_image and awx_task_docker_actual_image creates and pushes images to the private registry tagged with the awx version, which is appropriate, but then tries to pull with no tag. (so docker defaults to "latest", which does not exist)
2017-10-12 15:57:34 -04:00
Alan Rominger
ad0e43dc52 Merge pull request #379 from AlanCoding/awx160
Enforce max line length of 160 characters
2017-10-12 14:05:31 -04:00
Aaron Tan
5287e5c111 Implement workflow job failure
Relates #264.

This PR proposed and implemented a way of defining workflow failure
state:

A workflow job fails if at least one of the conditions below is satisfied:
* At least one node runs into the state `canceled` or `error`.
* At least one leaf node runs into the state `failed`, but no child node is
  spawned to run (no error handler).

Signed-off-by: Aaron Tan <jangsutsr@gmail.com>
2017-10-12 11:08:33 -04:00
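The two failure conditions above can be sketched directly. This is a hypothetical node representation (dicts with `status` and failure-handler `children`), not the actual AWX workflow DAG code:

```python
def workflow_job_failed(nodes):
    """Sketch of the failure definition described above."""
    for node in nodes:
        # condition 1: any node canceled or errored
        if node['status'] in ('canceled', 'error'):
            return True
        # condition 2: a failed leaf with no error-handler child spawned
        if node['status'] == 'failed' and not node['children']:
            return True
    return False
```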
Chris Meyers
e19a57c50a add job event replay awx-manage command
* awx-manage replay_job_event --job_id <id>
2017-10-12 09:40:30 -04:00
Jake McDermott
113d62a95f add smoke test 2017-10-11 18:28:56 -04:00
Jake McDermott
b5899c193a update object fields and commands 2017-10-11 18:28:36 -04:00
AlanCoding
8b41810189 Consolidation of variables parsing throughout codebase
* Remove attempted support of key=value pattern, because
  it is not actually allowed in practice
* Have variables validator defer to the utils variables parser
* Prune serializers of a handful of cases that previous
  attempts at cleanup have missed
2017-10-11 16:21:50 -04:00
Matthew Jones
f25ab7c6da Merge pull request #403 from jangsutsr/fix-391
Add extra encoding to ldap_dn verification
2017-10-11 14:35:35 -04:00
AlanCoding
f03b40aa50 enforce max line length of 160 characters 2017-10-11 12:38:39 -04:00
Aaron Tan
9dd4c7aaa3 Add extra encoding to ldap_dn verification
Relates #391.

Upstream `python-ldap` (surprisingly) does not support utf-8 DN. So
explicit encoding is needed.

Signed-off-by: Aaron Tan <jangsutsr@gmail.com>
2017-10-11 12:28:26 -04:00
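The fix amounts to encoding the DN before handing it to the library. A minimal sketch, with a hypothetical helper name (the actual call site in AWX's LDAP settings validation is not shown here):

```python
def prepare_dn(dn):
    """python-ldap's DN handling at the time expected byte strings only,
    so a text (unicode) DN must be explicitly encoded as UTF-8 before
    validation."""
    if isinstance(dn, str):
        return dn.encode('utf-8')
    return dn
```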
Jake McDermott
d4af743805 Merge pull request #397 from jakemcdermott/unit-linting
additional test de-linting
2017-10-10 20:56:40 -04:00
Jake McDermott
8b395c934c de-lint unit tests 2017-10-10 20:23:15 -04:00
Jake McDermott
ae0855614b update unit test file names 2017-10-10 20:18:05 -04:00
Jake McDermott
169cd1a466 Merge pull request #395 from gconsidine/ui/combine-test-dirs
Ui/combine test dirs
2017-10-10 18:33:12 -04:00
gconsidine
82f81752e4 De-lint test files and update test,build config 2017-10-10 16:59:42 -04:00
gconsidine
8b6cc0e323 Combine test directories 2017-10-10 16:59:42 -04:00
Greg Considine
c0996f5fb1 Merge pull request #394 from gconsidine/ui/fix/closing-curly-brace
Add closing curly brace in strings file
2017-10-10 16:58:41 -04:00
gconsidine
3998796bf0 Add closing curly brace in strings file 2017-10-10 16:42:56 -04:00
Alan Rominger
70f8ec78de Merge pull request #517 from AlanCoding/vars_exception
fallback to empty dict when processing extra_data
2017-10-10 09:30:06 -04:00
AlanCoding
e2c398ade2 fallback to empty dict when processing extra_data 2017-10-10 08:13:45 -04:00
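The defensive pattern named by that commit title can be sketched like so. The function name is hypothetical and, to stay dependency-free, this sketch parses JSON only (the real variables parser also accepts YAML):

```python
import json

def parse_extra_data(raw):
    """Fall back to an empty dict when the stored extra_data blob
    cannot be parsed, instead of raising."""
    if isinstance(raw, dict):
        return raw
    try:
        parsed = json.loads(raw)
    except (TypeError, ValueError):
        return {}
    return parsed if isinstance(parsed, dict) else {}
```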
Alan Rominger
7365e4a63f Merge pull request #516 from AlanCoding/abc_migrations
Fix 3.1.1->3.2.1 migration error
2017-10-09 16:37:35 -04:00
AlanCoding
d4fc4bcd61 fix migration problem from 3.1.1 2017-10-09 15:49:38 -04:00
AlanCoding
4237b9ed5c move 0005a migration to 0005b 2017-10-09 15:48:52 -04:00
Alan Rominger
2dd2541550 Merge pull request #507 from AlanCoding/adhoc_word_smithing
[3.2.2] feedback on ad hoc prohibited vars error msg
2017-10-06 14:15:16 -04:00
AlanCoding
edda5e5420 feedback on ad hoc prohibited vars error msg 2017-10-06 14:07:38 -04:00
Matthew Jones
721674f0cd Merge pull request #502 from wwitzel3/release_3.2.1
[3.2.1] Fix broken migration from 3.1.1/3.1.2 -> 3.2.1
2017-10-06 13:19:34 -04:00
Shane McDonald
f97ca9c42f Fix the way we include i18n files in sdist 2017-10-06 11:57:08 -04:00
Wayne Witzel III
8f883d8d43 Fix migrations to support 3.1.2 -> 3.2.1+ upgrade path 2017-10-06 09:25:43 -04:00
Wayne Witzel III
5bcf704b76 Merge pull request #501 from wwitzel3/release_3.2.1
[3.2.1] Fix missing update_capacity parameter
2017-10-06 00:37:07 -04:00
Wayne Witzel III
2818bb5833 fix missing parameter to update_capacity method 2017-10-06 00:23:12 -04:00
Alan Rominger
e8b79cde4a Merge pull request #499 from AlanCoding/adhoc_local
[3.2.1] Disallow Ansible vars in adhoc launch serializer + wayne's change
2017-10-05 15:30:06 -04:00
AlanCoding
81c14ce942 fix WARNING log when launching ad hoc command 2017-10-05 13:42:23 -04:00
AlanCoding
eacbeef660 Validate against ansible variables on ad hoc launch
Share code between this check for ad hoc and JT callback
2017-10-05 12:14:05 -04:00
Wayne Witzel III
02e3f45422 do not allow ansible connection type of local for ad_hoc 2017-10-04 17:55:36 -04:00
Chris Meyers
00afc87af1 Merge pull request #493 from chrismeyersfsu/fix-scan_job_migrations_unicode2
fix scan job migration unicode issue
2017-10-03 16:28:54 -04:00
Ryan Petrello
e48bebb761 Merge pull request #495 from ryanpetrello/fix-7713
work around an ansible 2.4 inventory caching bug
2017-10-03 16:26:41 -04:00
Ryan Petrello
4c5ec2fb3a work around an ansible 2.4 inventory caching bug
see: https://github.com/ansible/awx/issues/246
2017-10-03 15:45:11 -04:00
Chris Meyers
cad8710ac7 fix scan job migration unicode issue 2017-10-03 11:31:43 -04:00
Wayne Witzel III
96fd07d0f3 Assert isolated nodes have capacity set to 0 and restored based on version 2017-10-02 14:43:19 -04:00
Wayne Witzel III
692007072d Set capacity to zero if the isolated node has an old version 2017-10-02 14:43:18 -04:00
581 changed files with 14143 additions and 7112 deletions


@@ -14,13 +14,11 @@
<!-- Briefly describe the problem. -->
##### ENVIRONMENT
<!--
* AWX version: X.Y.Z
* AWX install method: openshift, minishift, docker on linux, docker for mac, boot2docker
* Ansible version: X.Y.Z
* Operating System:
* Web Browser:
-->
##### STEPS TO REPRODUCE

.gitignore

@@ -49,7 +49,7 @@ __pycache__
/.istanbul.yml
**/node_modules/**
/tmp
npm-debug.log
**/npm-debug.log*
# UI build flag files
awx/ui/.deps_built


@@ -8,6 +8,7 @@ This document provides a guide for installing AWX.
- [Clone the repo](#clone-the-repo)
- [AWX branding](#awx-branding)
- [Prerequisites](#prerequisites)
- [System Requirements](#system-requirements)
- [AWX Tunables](#awx-tunables)
- [Choose a deployment platform](#choose-a-deployment-platform)
- [Official vs Building Images](#official-vs-building-images)
@@ -49,12 +50,21 @@ To install the assets, clone the `awx-logos` repo so that it is next to your `aw
Before you can run a deployment, you'll need the following installed in your local environment:
- [Ansible](http://docs.ansible.com/ansible/latest/intro_installation.html) Requires Version 2.3+
- [Ansible](http://docs.ansible.com/ansible/latest/intro_installation.html) Requires Version 2.4+
- [Docker](https://docs.docker.com/engine/installation/)
- [docker-py](https://github.com/docker/docker-py) Python module
- [GNU Make](https://www.gnu.org/software/make/)
- [Git](https://git-scm.com/)
### System Requirements
The system that runs the AWX service will need to satisfy the following requirements
- At least 4GB of memory
- At least 2 cpu cores
- At least 20GB of space
- Running Docker or Openshift
### AWX Tunables
**TODO** add tunable bits
@@ -330,6 +340,10 @@ If you wish to tag and push built images to a Docker registry, set the following
> Username of the user that will push images to the registry. Defaults to *developer*.
*docker_remove_local_images*
> Due to the way that the docker_image module behaves, images will not be pushed to a remote repository if they are present locally. Set this to delete local versions of the images that will be pushed to the remote. This will fail if containers are currently running from those images.
**Note**
> These settings are ignored if using official images


@@ -1,4 +1,6 @@
recursive-include awx *.py
recursive-include awx *.po
recursive-include awx *.mo
recursive-include awx/static *
recursive-include awx/templates *.html
recursive-include awx/api/templates *.md *.html


@@ -83,7 +83,9 @@ I18N_FLAG_FILE = .i18n_built
clean-ui:
rm -rf awx/ui/static/
rm -rf awx/ui/node_modules/
rm -rf awx/ui/coverage/
rm -rf awx/ui/test/unit/reports/
rm -rf awx/ui/test/spec/reports/
rm -rf awx/ui/test/e2e/reports/
rm -rf awx/ui/client/languages/
rm -f $(UI_DEPS_FLAG_FILE)
rm -f $(UI_RELEASE_FLAG_FILE)
@@ -201,8 +203,11 @@ develop:
fi
version_file:
mkdir -p /var/lib/awx/
python -c "import awx as awx; print awx.__version__" > /var/lib/awx/.awx_version
mkdir -p /var/lib/awx/; \
if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
python -c "import awx as awx; print awx.__version__" > /var/lib/awx/.awx_version; \
# Do any one-time init tasks.
comma := ,
@@ -282,7 +287,7 @@ flower:
@if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
$(PYTHON) manage.py celery flower --address=0.0.0.0 --port=5555 --broker=amqp://guest:guest@$(RABBITMQ_HOST):5672//
celery flower --address=0.0.0.0 --port=5555 --broker=amqp://guest:guest@$(RABBITMQ_HOST):5672//
collectstatic:
@if [ "$(VENV_BASE)" ]; then \
@@ -320,8 +325,7 @@ celeryd:
@if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
$(PYTHON) manage.py celeryd -l DEBUG -B -Ofair --autoreload --autoscale=100,4 --schedule=$(CELERY_SCHEDULE_FILE) -Q tower_scheduler,tower_broadcast_all,$(COMPOSE_HOST),$(AWX_GROUP_QUEUES) -n celery@$(COMPOSE_HOST)
#$(PYTHON) manage.py celery multi show projects jobs default -l DEBUG -Q:projects projects -Q:jobs jobs -Q:default default -c:projects 1 -c:jobs 3 -c:default 3 -Ofair -B --schedule=$(CELERY_SCHEDULE_FILE)
celery worker -A awx -l DEBUG -B -Ofair --autoscale=100,4 --schedule=$(CELERY_SCHEDULE_FILE) -Q tower_scheduler,tower_broadcast_all,$(COMPOSE_HOST),$(AWX_GROUP_QUEUES) -n celery@$(COMPOSE_HOST)
# Run to start the zeromq callback receiver
receiver:
@@ -330,18 +334,18 @@ receiver:
fi; \
$(PYTHON) manage.py run_callback_receiver
socketservice:
@if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
$(PYTHON) manage.py run_socketio_service
nginx:
nginx -g "daemon off;"
rdb:
$(PYTHON) tools/rdb.py
jupyter:
@if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
$(MANAGEMENT_COMMAND) shell_plus --notebook
reports:
mkdir -p $@
@@ -495,12 +499,14 @@ ui: clean-ui ui-devel
ui-test-ci: $(UI_DEPS_FLAG_FILE)
$(NPM_BIN) --prefix awx/ui run test:ci
$(NPM_BIN) --prefix awx/ui run unit
testjs_ci:
echo "Update UI unittests later" #ui-test-ci
jshint: $(UI_DEPS_FLAG_FILE)
$(NPM_BIN) run --prefix awx/ui jshint
$(NPM_BIN) run --prefix awx/ui lint
# END UI TASKS
# --------------------------------------
@@ -545,6 +551,7 @@ docker-isolated:
TAG=$(COMPOSE_TAG) DEV_DOCKER_TAG_BASE=$(DEV_DOCKER_TAG_BASE) docker-compose -f tools/docker-compose.yml -f tools/docker-isolated-override.yml create
docker start tools_awx_1
docker start tools_isolated_1
echo "__version__ = '`python setup.py --version`'" | docker exec -i tools_isolated_1 /bin/bash -c "cat > /venv/awx/lib/python2.7/site-packages/awx.py"
if [ "`docker exec -i -t tools_isolated_1 cat /root/.ssh/authorized_keys`" == "`docker exec -t tools_awx_1 cat /root/.ssh/id_rsa.pub`" ]; then \
echo "SSH keys already copied to isolated instance"; \
else \


@@ -1,15 +1,17 @@
# Copyright (c) 2015 Ansible, Inc.
# All Rights Reserved.
from __future__ import absolute_import, unicode_literals
import os
import sys
import warnings
from pkg_resources import get_distribution
from .celery import app as celery_app
__version__ = get_distribution('awx').version
__all__ = ['__version__']
__all__ = ['__version__', 'celery_app']
# Check for the presence/absence of "devonly" module to determine if running
# from a source code checkout or release package.


@@ -17,7 +17,7 @@ from rest_framework import exceptions
from rest_framework import HTTP_HEADER_ENCODING
# AWX
from awx.main.models import UnifiedJob, AuthToken
from awx.main.models import AuthToken
logger = logging.getLogger('awx.api.authentication')
@@ -137,29 +137,3 @@ class LoggedBasicAuthentication(authentication.BasicAuthentication):
if not settings.AUTH_BASIC_ENABLED:
return
return super(LoggedBasicAuthentication, self).authenticate_header(request)
class TaskAuthentication(authentication.BaseAuthentication):
'''
Custom authentication used for views accessed by the inventory and callback
scripts when running a task.
'''
model = None
def authenticate(self, request):
auth = authentication.get_authorization_header(request).split()
if len(auth) != 2 or auth[0].lower() != 'token' or '-' not in auth[1]:
return None
pk, key = auth[1].split('-', 1)
try:
unified_job = UnifiedJob.objects.get(pk=pk, status='running')
except UnifiedJob.DoesNotExist:
return None
token = unified_job.task_auth_token
if auth[1] != token:
raise exceptions.AuthenticationFailed(_('Invalid task token'))
return (None, token)
def authenticate_header(self, request):
return 'Token'


@@ -22,6 +22,7 @@ from rest_framework.filters import BaseFilterBackend
# AWX
from awx.main.utils import get_type_for_model, to_python_boolean
from awx.main.utils.db import get_all_field_names
from awx.main.models.credential import CredentialType
from awx.main.models.rbac import RoleAncestorEntry
@@ -70,7 +71,7 @@ class TypeFilterBackend(BaseFilterBackend):
types_map[ct_type] = ct.pk
model = queryset.model
model_type = get_type_for_model(model)
if 'polymorphic_ctype' in model._meta.get_all_field_names():
if 'polymorphic_ctype' in get_all_field_names(model):
types_pks = set([v for k,v in types_map.items() if k in types])
queryset = queryset.filter(polymorphic_ctype_id__in=types_pks)
elif model_type in types:
@@ -119,7 +120,7 @@ class FieldLookupBackend(BaseFilterBackend):
'last_updated': 'last_job_run',
}.get(name, name)
if name == 'type' and 'polymorphic_ctype' in model._meta.get_all_field_names():
if name == 'type' and 'polymorphic_ctype' in get_all_field_names(model):
name = 'polymorphic_ctype'
new_parts.append('polymorphic_ctype__model')
else:
@@ -136,7 +137,7 @@ class FieldLookupBackend(BaseFilterBackend):
new_parts.pop()
new_parts.append(name_alt)
else:
field = model._meta.get_field_by_name(name)[0]
field = model._meta.get_field(name)
if isinstance(field, ForeignObjectRel) and getattr(field.field, '__prevent_search__', False):
raise PermissionDenied(_('Filtering on %s is not allowed.' % name))
elif getattr(field, '__prevent_search__', False):
@@ -268,8 +269,10 @@ class FieldLookupBackend(BaseFilterBackend):
# Make legacy v1 Job/Template fields work for backwards compatability
# TODO: remove after API v1 deprecation period
if queryset.model._meta.object_name in ('JobTemplate', 'Job') and key in ('cloud_credential', 'network_credential'):
key = 'extra_credentials'
if queryset.model._meta.object_name in ('JobTemplate', 'Job') and key in (
'credential', 'vault_credential', 'cloud_credential', 'network_credential'
):
key = 'credentials'
# Make legacy v1 Credential fields work for backwards compatability
# TODO: remove after API v1 deprecation period
@@ -375,7 +378,7 @@ class OrderByBackend(BaseFilterBackend):
# given the limited number of views with multiple types,
# sorting on polymorphic_ctype.model is effectively the same.
new_order_by = []
if 'polymorphic_ctype' in queryset.model._meta.get_all_field_names():
if 'polymorphic_ctype' in get_all_field_names(queryset.model):
for field in order_by:
if field == 'type':
new_order_by.append('polymorphic_ctype__model')


@@ -31,6 +31,7 @@ from rest_framework import views
from awx.api.filters import FieldLookupBackend
from awx.main.models import * # noqa
from awx.main.utils import * # noqa
from awx.main.utils.db import get_all_field_names
from awx.api.serializers import ResourceAccessListElementSerializer
from awx.api.versioning import URLPathVersioning, get_request_version
from awx.api.metadata import SublistAttachDetatchMetadata
@@ -188,6 +189,7 @@ class APIView(views.APIView):
'new_in_300': getattr(self, 'new_in_300', False),
'new_in_310': getattr(self, 'new_in_310', False),
'new_in_320': getattr(self, 'new_in_320', False),
'new_in_330': getattr(self, 'new_in_330', False),
'new_in_api_v2': getattr(self, 'new_in_api_v2', False),
'deprecated': getattr(self, 'deprecated', False),
}
@@ -321,8 +323,7 @@ class ListAPIView(generics.ListAPIView, GenericAPIView):
return page
def get_description_context(self):
opts = self.model._meta
if 'username' in opts.get_all_field_names():
if 'username' in get_all_field_names(self.model):
order_field = 'username'
else:
order_field = 'name'


@@ -52,14 +52,18 @@ class ModelAccessPermission(permissions.BasePermission):
if not check_user_access(request.user, view.model, 'add', {view.parent_key: parent_obj}):
return False
return True
elif getattr(view, 'is_job_start', False):
elif hasattr(view, 'obj_permission_type'):
# Generic object-centric view permission check without object not needed
if not obj:
return True
return check_user_access(request.user, view.model, 'start', obj)
elif getattr(view, 'is_job_cancel', False):
if not obj:
return True
return check_user_access(request.user, view.model, 'cancel', obj)
# Permission check that happens when get_object() is called
extra_kwargs = {}
if view.obj_permission_type == 'admin':
extra_kwargs['data'] = {}
return check_user_access(
request.user, view.model, view.obj_permission_type, obj,
**extra_kwargs
)
else:
if obj:
return True

File diff suppressed because it is too large


@@ -10,4 +10,5 @@
{% if new_in_300 %}> _Added in Ansible Tower 3.0.0_{% endif %}
{% if new_in_310 %}> _New in Ansible Tower 3.1.0_{% endif %}
{% if new_in_320 %}> _New in Ansible Tower 3.2.0_{% endif %}
{% if new_in_330 %}> _New in Ansible Tower 3.3.0_{% endif %}
{% endif %}


@@ -0,0 +1,12 @@
Create a schedule based on a job:
Make a POST request to this endpoint to create a schedule that launches
the job template that launched this job, and uses the same
parameters that the job was launched with. These parameters include all
"prompted" resources such as `extra_vars`, `inventory`, `limit`, etc.
Jobs that were launched with user-provided passwords cannot have a schedule
created from them.
Make a GET request for information about what those prompts are and
whether or not a schedule can be created.


@@ -26,9 +26,6 @@ The response will include the following fields:
job_template (array, read-only)
* `survey_enabled`: Flag indicating whether the job_template has an enabled
survey (boolean, read-only)
* `credential_needed_to_start`: Flag indicating the presence of a credential
associated with the job template. If not then one should be supplied when
launching the job (boolean, read-only)
* `inventory_needed_to_start`: Flag indicating the presence of an inventory
associated with the job template. If not then one should be supplied when
launching the job (boolean, read-only)
@@ -36,9 +33,8 @@ The response will include the following fields:
Make a POST request to this resource to launch the job_template. If any
passwords, inventory, or extra variables (extra_vars) are required, they must
be passed via POST data, with extra_vars given as a YAML or JSON string and
escaped parentheses. If `credential_needed_to_start` is `True` then the
`credential` field is required and if the `inventory_needed_to_start` is
`True` then the `inventory` is required as well.
escaped parentheses. If the `inventory_needed_to_start` is `True` then the
`inventory` is required.
If successful, the response status code will be 201. If any required passwords
are not provided, a 400 status code will be returned. If the job cannot be
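Per the launch documentation above, `extra_vars` must be passed as a YAML or JSON *string*, not as a nested object. A hedged sketch of building such a POST body (the field values here are illustrative, not taken from the source):

```python
import json

# Launch-request body: extra_vars is serialized to a JSON string
# rather than sent as a nested JSON object (values are illustrative)
payload = {
    'extra_vars': json.dumps({'env': 'staging', 'count': 3}),
    'inventory': 1,  # required when inventory_needed_to_start is True
}
```

Sending `extra_vars` as a plain dict instead of a serialized string is a common source of 400 responses against this endpoint.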


@@ -1,420 +0,0 @@
# Copyright (c) 2015 Ansible, Inc.
# All Rights Reserved.
# noqa
from django.conf.urls import include, patterns, url as original_url
def url(regex, view, kwargs=None, name=None, prefix=''):
# Set default name from view name (if a string).
if isinstance(view, basestring) and name is None:
name = view
return original_url(regex, view, kwargs, name, prefix)
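The `url()` wrapper above only defaults `name` when the view is given as a dotted string. A stand-alone sketch of that behavior, using a stand-in for Django's `original_url` (and `str` in place of the Python 2 `basestring` used in the source):

```python
def original_url(regex, view, kwargs=None, name=None, prefix=''):
    # Stand-in for the old django.conf.urls.url: just records its arguments
    return (regex, view, kwargs, name, prefix)

def url(regex, view, kwargs=None, name=None, prefix=''):
    # Default the URL name from the view's dotted-string name, as above
    # (the original tests isinstance(view, basestring); str is the
    # Python 3 analogue)
    if isinstance(view, str) and name is None:
        name = view
    return original_url(regex, view, kwargs, name, prefix)
```

So `url(r'^$', 'organization_list')` yields a pattern named `organization_list` without repeating the string, which is why none of the entries below pass `name=` explicitly.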
organization_urls = patterns('awx.api.views',
url(r'^$', 'organization_list'),
url(r'^(?P<pk>[0-9]+)/$', 'organization_detail'),
url(r'^(?P<pk>[0-9]+)/users/$', 'organization_users_list'),
url(r'^(?P<pk>[0-9]+)/admins/$', 'organization_admins_list'),
url(r'^(?P<pk>[0-9]+)/inventories/$', 'organization_inventories_list'),
url(r'^(?P<pk>[0-9]+)/projects/$', 'organization_projects_list'),
url(r'^(?P<pk>[0-9]+)/workflow_job_templates/$', 'organization_workflow_job_templates_list'),
url(r'^(?P<pk>[0-9]+)/teams/$', 'organization_teams_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', 'organization_credential_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'organization_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates/$', 'organization_notification_templates_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', 'organization_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', 'organization_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', 'organization_notification_templates_success_list'),
url(r'^(?P<pk>[0-9]+)/instance_groups/$', 'organization_instance_groups_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', 'organization_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', 'organization_access_list'),
)
user_urls = patterns('awx.api.views',
url(r'^$', 'user_list'),
url(r'^(?P<pk>[0-9]+)/$', 'user_detail'),
url(r'^(?P<pk>[0-9]+)/teams/$', 'user_teams_list'),
url(r'^(?P<pk>[0-9]+)/organizations/$', 'user_organizations_list'),
url(r'^(?P<pk>[0-9]+)/admin_of_organizations/$', 'user_admin_of_organizations_list'),
url(r'^(?P<pk>[0-9]+)/projects/$', 'user_projects_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', 'user_credentials_list'),
url(r'^(?P<pk>[0-9]+)/roles/$', 'user_roles_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'user_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', 'user_access_list'),
)
project_urls = patterns('awx.api.views',
url(r'^$', 'project_list'),
url(r'^(?P<pk>[0-9]+)/$', 'project_detail'),
url(r'^(?P<pk>[0-9]+)/playbooks/$', 'project_playbooks'),
url(r'^(?P<pk>[0-9]+)/inventories/$', 'project_inventories'),
url(r'^(?P<pk>[0-9]+)/scm_inventory_sources/$', 'project_scm_inventory_sources'),
url(r'^(?P<pk>[0-9]+)/teams/$', 'project_teams_list'),
url(r'^(?P<pk>[0-9]+)/update/$', 'project_update_view'),
url(r'^(?P<pk>[0-9]+)/project_updates/$', 'project_updates_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'project_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/schedules/$', 'project_schedules_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', 'project_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', 'project_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', 'project_notification_templates_success_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', 'project_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', 'project_access_list'),
)
project_update_urls = patterns('awx.api.views',
url(r'^$', 'project_update_list'),
url(r'^(?P<pk>[0-9]+)/$', 'project_update_detail'),
url(r'^(?P<pk>[0-9]+)/cancel/$', 'project_update_cancel'),
url(r'^(?P<pk>[0-9]+)/stdout/$', 'project_update_stdout'),
url(r'^(?P<pk>[0-9]+)/scm_inventory_updates/$', 'project_update_scm_inventory_updates'),
url(r'^(?P<pk>[0-9]+)/notifications/$', 'project_update_notifications_list'),
)
team_urls = patterns('awx.api.views',
url(r'^$', 'team_list'),
url(r'^(?P<pk>[0-9]+)/$', 'team_detail'),
url(r'^(?P<pk>[0-9]+)/projects/$', 'team_projects_list'),
url(r'^(?P<pk>[0-9]+)/users/$', 'team_users_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', 'team_credentials_list'),
url(r'^(?P<pk>[0-9]+)/roles/$', 'team_roles_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', 'team_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'team_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', 'team_access_list'),
)
inventory_urls = patterns('awx.api.views',
url(r'^$', 'inventory_list'),
url(r'^(?P<pk>[0-9]+)/$', 'inventory_detail'),
url(r'^(?P<pk>[0-9]+)/hosts/$', 'inventory_hosts_list'),
url(r'^(?P<pk>[0-9]+)/groups/$', 'inventory_groups_list'),
url(r'^(?P<pk>[0-9]+)/root_groups/$', 'inventory_root_groups_list'),
url(r'^(?P<pk>[0-9]+)/variable_data/$', 'inventory_variable_data'),
url(r'^(?P<pk>[0-9]+)/script/$', 'inventory_script_view'),
url(r'^(?P<pk>[0-9]+)/tree/$', 'inventory_tree_view'),
url(r'^(?P<pk>[0-9]+)/inventory_sources/$', 'inventory_inventory_sources_list'),
url(r'^(?P<pk>[0-9]+)/update_inventory_sources/$', 'inventory_inventory_sources_update'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'inventory_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/job_templates/$', 'inventory_job_template_list'),
url(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', 'inventory_ad_hoc_commands_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', 'inventory_access_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', 'inventory_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/instance_groups/$', 'inventory_instance_groups_list'),
#url(r'^(?P<pk>[0-9]+)/single_fact/$', 'inventory_single_fact_view'),
)
host_urls = patterns('awx.api.views',
url(r'^$', 'host_list'),
url(r'^(?P<pk>[0-9]+)/$', 'host_detail'),
url(r'^(?P<pk>[0-9]+)/variable_data/$', 'host_variable_data'),
url(r'^(?P<pk>[0-9]+)/groups/$', 'host_groups_list'),
url(r'^(?P<pk>[0-9]+)/all_groups/$', 'host_all_groups_list'),
url(r'^(?P<pk>[0-9]+)/job_events/', 'host_job_events_list'),
url(r'^(?P<pk>[0-9]+)/job_host_summaries/$', 'host_job_host_summaries_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'host_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/inventory_sources/$', 'host_inventory_sources_list'),
url(r'^(?P<pk>[0-9]+)/smart_inventories/$', 'host_smart_inventories_list'),
url(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', 'host_ad_hoc_commands_list'),
url(r'^(?P<pk>[0-9]+)/ad_hoc_command_events/$', 'host_ad_hoc_command_events_list'),
#url(r'^(?P<pk>[0-9]+)/single_fact/$', 'host_single_fact_view'),
url(r'^(?P<pk>[0-9]+)/fact_versions/$', 'host_fact_versions_list'),
url(r'^(?P<pk>[0-9]+)/fact_view/$', 'host_fact_compare_view'),
url(r'^(?P<pk>[0-9]+)/insights/$', 'host_insights'),
)
group_urls = patterns('awx.api.views',
url(r'^$', 'group_list'),
url(r'^(?P<pk>[0-9]+)/$', 'group_detail'),
url(r'^(?P<pk>[0-9]+)/children/$', 'group_children_list'),
url(r'^(?P<pk>[0-9]+)/hosts/$', 'group_hosts_list'),
url(r'^(?P<pk>[0-9]+)/all_hosts/$', 'group_all_hosts_list'),
url(r'^(?P<pk>[0-9]+)/variable_data/$', 'group_variable_data'),
url(r'^(?P<pk>[0-9]+)/job_events/$', 'group_job_events_list'),
url(r'^(?P<pk>[0-9]+)/job_host_summaries/$', 'group_job_host_summaries_list'),
url(r'^(?P<pk>[0-9]+)/potential_children/$', 'group_potential_children_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'group_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/inventory_sources/$', 'group_inventory_sources_list'),
url(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', 'group_ad_hoc_commands_list'),
#url(r'^(?P<pk>[0-9]+)/single_fact/$', 'group_single_fact_view'),
)
inventory_source_urls = patterns('awx.api.views',
url(r'^$', 'inventory_source_list'),
url(r'^(?P<pk>[0-9]+)/$', 'inventory_source_detail'),
url(r'^(?P<pk>[0-9]+)/update/$', 'inventory_source_update_view'),
url(r'^(?P<pk>[0-9]+)/inventory_updates/$', 'inventory_source_updates_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'inventory_source_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/schedules/$', 'inventory_source_schedules_list'),
url(r'^(?P<pk>[0-9]+)/groups/$', 'inventory_source_groups_list'),
url(r'^(?P<pk>[0-9]+)/hosts/$', 'inventory_source_hosts_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', 'inventory_source_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', 'inventory_source_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', 'inventory_source_notification_templates_success_list'),
)
inventory_update_urls = patterns('awx.api.views',
url(r'^$', 'inventory_update_list'),
url(r'^(?P<pk>[0-9]+)/$', 'inventory_update_detail'),
url(r'^(?P<pk>[0-9]+)/cancel/$', 'inventory_update_cancel'),
url(r'^(?P<pk>[0-9]+)/stdout/$', 'inventory_update_stdout'),
url(r'^(?P<pk>[0-9]+)/notifications/$', 'inventory_update_notifications_list'),
)
inventory_script_urls = patterns('awx.api.views',
url(r'^$', 'inventory_script_list'),
url(r'^(?P<pk>[0-9]+)/$', 'inventory_script_detail'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', 'inventory_script_object_roles_list'),
)
credential_type_urls = patterns('awx.api.views',
url(r'^$', 'credential_type_list'),
url(r'^(?P<pk>[0-9]+)/$', 'credential_type_detail'),
url(r'^(?P<pk>[0-9]+)/credentials/$', 'credential_type_credential_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'credential_type_activity_stream_list'),
)
credential_urls = patterns('awx.api.views',
url(r'^$', 'credential_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'credential_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/$', 'credential_detail'),
url(r'^(?P<pk>[0-9]+)/access_list/$', 'credential_access_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', 'credential_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/owner_users/$', 'credential_owner_users_list'),
url(r'^(?P<pk>[0-9]+)/owner_teams/$', 'credential_owner_teams_list'),
# See also credentials resources on users/teams.
)
role_urls = patterns('awx.api.views',
url(r'^$', 'role_list'),
url(r'^(?P<pk>[0-9]+)/$', 'role_detail'),
url(r'^(?P<pk>[0-9]+)/users/$', 'role_users_list'),
url(r'^(?P<pk>[0-9]+)/teams/$', 'role_teams_list'),
url(r'^(?P<pk>[0-9]+)/parents/$', 'role_parents_list'),
url(r'^(?P<pk>[0-9]+)/children/$', 'role_children_list'),
)
job_template_urls = patterns('awx.api.views',
url(r'^$', 'job_template_list'),
url(r'^(?P<pk>[0-9]+)/$', 'job_template_detail'),
url(r'^(?P<pk>[0-9]+)/launch/$', 'job_template_launch'),
url(r'^(?P<pk>[0-9]+)/jobs/$', 'job_template_jobs_list'),
url(r'^(?P<pk>[0-9]+)/callback/$', 'job_template_callback'),
url(r'^(?P<pk>[0-9]+)/schedules/$', 'job_template_schedules_list'),
url(r'^(?P<pk>[0-9]+)/survey_spec/$', 'job_template_survey_spec'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'job_template_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', 'job_template_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', 'job_template_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', 'job_template_notification_templates_success_list'),
url(r'^(?P<pk>[0-9]+)/instance_groups/$', 'job_template_instance_groups_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', 'job_template_access_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', 'job_template_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/labels/$', 'job_template_label_list'),
)
job_urls = patterns('awx.api.views',
url(r'^$', 'job_list'),
url(r'^(?P<pk>[0-9]+)/$', 'job_detail'),
url(r'^(?P<pk>[0-9]+)/start/$', 'job_start'), # TODO: remove in 3.3
url(r'^(?P<pk>[0-9]+)/cancel/$', 'job_cancel'),
url(r'^(?P<pk>[0-9]+)/relaunch/$', 'job_relaunch'),
url(r'^(?P<pk>[0-9]+)/job_host_summaries/$', 'job_job_host_summaries_list'),
url(r'^(?P<pk>[0-9]+)/job_events/$', 'job_job_events_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'job_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/stdout/$', 'job_stdout'),
url(r'^(?P<pk>[0-9]+)/notifications/$', 'job_notifications_list'),
url(r'^(?P<pk>[0-9]+)/labels/$', 'job_label_list'),
)
job_host_summary_urls = patterns('awx.api.views',
url(r'^(?P<pk>[0-9]+)/$', 'job_host_summary_detail'),
)
job_event_urls = patterns('awx.api.views',
url(r'^$', 'job_event_list'),
url(r'^(?P<pk>[0-9]+)/$', 'job_event_detail'),
url(r'^(?P<pk>[0-9]+)/children/$', 'job_event_children_list'),
url(r'^(?P<pk>[0-9]+)/hosts/$', 'job_event_hosts_list'),
)
ad_hoc_command_urls = patterns('awx.api.views',
url(r'^$', 'ad_hoc_command_list'),
url(r'^(?P<pk>[0-9]+)/$', 'ad_hoc_command_detail'),
url(r'^(?P<pk>[0-9]+)/cancel/$', 'ad_hoc_command_cancel'),
url(r'^(?P<pk>[0-9]+)/relaunch/$', 'ad_hoc_command_relaunch'),
url(r'^(?P<pk>[0-9]+)/events/$', 'ad_hoc_command_ad_hoc_command_events_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'ad_hoc_command_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/notifications/$', 'ad_hoc_command_notifications_list'),
url(r'^(?P<pk>[0-9]+)/stdout/$', 'ad_hoc_command_stdout'),
)
ad_hoc_command_event_urls = patterns('awx.api.views',
url(r'^$', 'ad_hoc_command_event_list'),
url(r'^(?P<pk>[0-9]+)/$', 'ad_hoc_command_event_detail'),
)
system_job_template_urls = patterns('awx.api.views',
url(r'^$', 'system_job_template_list'),
url(r'^(?P<pk>[0-9]+)/$', 'system_job_template_detail'),
url(r'^(?P<pk>[0-9]+)/launch/$', 'system_job_template_launch'),
url(r'^(?P<pk>[0-9]+)/jobs/$', 'system_job_template_jobs_list'),
url(r'^(?P<pk>[0-9]+)/schedules/$', 'system_job_template_schedules_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', 'system_job_template_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', 'system_job_template_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', 'system_job_template_notification_templates_success_list'),
)
system_job_urls = patterns('awx.api.views',
url(r'^$', 'system_job_list'),
url(r'^(?P<pk>[0-9]+)/$', 'system_job_detail'),
url(r'^(?P<pk>[0-9]+)/cancel/$', 'system_job_cancel'),
url(r'^(?P<pk>[0-9]+)/notifications/$', 'system_job_notifications_list'),
)
workflow_job_template_urls = patterns('awx.api.views',
url(r'^$', 'workflow_job_template_list'),
url(r'^(?P<pk>[0-9]+)/$', 'workflow_job_template_detail'),
url(r'^(?P<pk>[0-9]+)/workflow_jobs/$', 'workflow_job_template_jobs_list'),
url(r'^(?P<pk>[0-9]+)/launch/$', 'workflow_job_template_launch'),
url(r'^(?P<pk>[0-9]+)/copy/$', 'workflow_job_template_copy'),
url(r'^(?P<pk>[0-9]+)/schedules/$', 'workflow_job_template_schedules_list'),
url(r'^(?P<pk>[0-9]+)/survey_spec/$', 'workflow_job_template_survey_spec'),
url(r'^(?P<pk>[0-9]+)/workflow_nodes/$', 'workflow_job_template_workflow_nodes_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'workflow_job_template_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', 'workflow_job_template_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', 'workflow_job_template_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', 'workflow_job_template_notification_templates_success_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', 'workflow_job_template_access_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', 'workflow_job_template_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/labels/$', 'workflow_job_template_label_list'),
)
workflow_job_urls = patterns('awx.api.views',
url(r'^$', 'workflow_job_list'),
url(r'^(?P<pk>[0-9]+)/$', 'workflow_job_detail'),
url(r'^(?P<pk>[0-9]+)/workflow_nodes/$', 'workflow_job_workflow_nodes_list'),
url(r'^(?P<pk>[0-9]+)/labels/$', 'workflow_job_label_list'),
url(r'^(?P<pk>[0-9]+)/cancel/$', 'workflow_job_cancel'),
url(r'^(?P<pk>[0-9]+)/relaunch/$', 'workflow_job_relaunch'),
url(r'^(?P<pk>[0-9]+)/notifications/$', 'workflow_job_notifications_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', 'workflow_job_activity_stream_list'),
)
notification_template_urls = patterns('awx.api.views',
url(r'^$', 'notification_template_list'),
url(r'^(?P<pk>[0-9]+)/$', 'notification_template_detail'),
url(r'^(?P<pk>[0-9]+)/test/$', 'notification_template_test'),
url(r'^(?P<pk>[0-9]+)/notifications/$', 'notification_template_notification_list'),
)
notification_urls = patterns('awx.api.views',
url(r'^$', 'notification_list'),
url(r'^(?P<pk>[0-9]+)/$', 'notification_detail'),
)
label_urls = patterns('awx.api.views',
url(r'^$', 'label_list'),
url(r'^(?P<pk>[0-9]+)/$', 'label_detail'),
)
workflow_job_template_node_urls = patterns('awx.api.views',
url(r'^$', 'workflow_job_template_node_list'),
url(r'^(?P<pk>[0-9]+)/$', 'workflow_job_template_node_detail'),
url(r'^(?P<pk>[0-9]+)/success_nodes/$', 'workflow_job_template_node_success_nodes_list'),
url(r'^(?P<pk>[0-9]+)/failure_nodes/$', 'workflow_job_template_node_failure_nodes_list'),
url(r'^(?P<pk>[0-9]+)/always_nodes/$', 'workflow_job_template_node_always_nodes_list'),
)
workflow_job_node_urls = patterns('awx.api.views',
url(r'^$', 'workflow_job_node_list'),
url(r'^(?P<pk>[0-9]+)/$', 'workflow_job_node_detail'),
url(r'^(?P<pk>[0-9]+)/success_nodes/$', 'workflow_job_node_success_nodes_list'),
url(r'^(?P<pk>[0-9]+)/failure_nodes/$', 'workflow_job_node_failure_nodes_list'),
url(r'^(?P<pk>[0-9]+)/always_nodes/$', 'workflow_job_node_always_nodes_list'),
)
schedule_urls = patterns('awx.api.views',
url(r'^$', 'schedule_list'),
url(r'^(?P<pk>[0-9]+)/$', 'schedule_detail'),
url(r'^(?P<pk>[0-9]+)/jobs/$', 'schedule_unified_jobs_list'),
)
activity_stream_urls = patterns('awx.api.views',
url(r'^$', 'activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/$', 'activity_stream_detail'),
)
instance_urls = patterns('awx.api.views',
url(r'^$', 'instance_list'),
url(r'^(?P<pk>[0-9]+)/$', 'instance_detail'),
url(r'^(?P<pk>[0-9]+)/jobs/$', 'instance_unified_jobs_list'),
url(r'^(?P<pk>[0-9]+)/instance_groups/$', 'instance_instance_groups_list'),
)
instance_group_urls = patterns('awx.api.views',
url(r'^$', 'instance_group_list'),
url(r'^(?P<pk>[0-9]+)/$', 'instance_group_detail'),
url(r'^(?P<pk>[0-9]+)/jobs/$', 'instance_group_unified_jobs_list'),
url(r'^(?P<pk>[0-9]+)/instances/$', 'instance_group_instance_list'),
)
v1_urls = patterns('awx.api.views',
url(r'^$', 'api_v1_root_view'),
url(r'^ping/$', 'api_v1_ping_view'),
url(r'^config/$', 'api_v1_config_view'),
url(r'^auth/$', 'auth_view'),
url(r'^authtoken/$', 'auth_token_view'),
url(r'^me/$', 'user_me_list'),
url(r'^dashboard/$', 'dashboard_view'),
url(r'^dashboard/graphs/jobs/$','dashboard_jobs_graph_view'),
url(r'^settings/', include('awx.conf.urls')),
url(r'^instances/', include(instance_urls)),
url(r'^instance_groups/', include(instance_group_urls)),
url(r'^schedules/', include(schedule_urls)),
url(r'^organizations/', include(organization_urls)),
url(r'^users/', include(user_urls)),
url(r'^projects/', include(project_urls)),
url(r'^project_updates/', include(project_update_urls)),
url(r'^teams/', include(team_urls)),
url(r'^inventories/', include(inventory_urls)),
url(r'^hosts/', include(host_urls)),
url(r'^groups/', include(group_urls)),
url(r'^inventory_sources/', include(inventory_source_urls)),
url(r'^inventory_updates/', include(inventory_update_urls)),
url(r'^inventory_scripts/', include(inventory_script_urls)),
url(r'^credentials/', include(credential_urls)),
url(r'^roles/', include(role_urls)),
url(r'^job_templates/', include(job_template_urls)),
url(r'^jobs/', include(job_urls)),
url(r'^job_host_summaries/', include(job_host_summary_urls)),
url(r'^job_events/', include(job_event_urls)),
url(r'^ad_hoc_commands/', include(ad_hoc_command_urls)),
url(r'^ad_hoc_command_events/', include(ad_hoc_command_event_urls)),
url(r'^system_job_templates/', include(system_job_template_urls)),
url(r'^system_jobs/', include(system_job_urls)),
url(r'^notification_templates/', include(notification_template_urls)),
url(r'^notifications/', include(notification_urls)),
url(r'^workflow_job_templates/', include(workflow_job_template_urls)),
url(r'^workflow_jobs/', include(workflow_job_urls)),
url(r'^labels/', include(label_urls)),
url(r'^workflow_job_template_nodes/', include(workflow_job_template_node_urls)),
url(r'^workflow_job_nodes/', include(workflow_job_node_urls)),
url(r'^unified_job_templates/$','unified_job_template_list'),
url(r'^unified_jobs/$', 'unified_job_list'),
url(r'^activity_stream/', include(activity_stream_urls)),
)
v2_urls = patterns('awx.api.views',
url(r'^$', 'api_v2_root_view'),
url(r'^credential_types/', include(credential_type_urls)),
url(r'^hosts/(?P<pk>[0-9]+)/ansible_facts/$', 'host_ansible_facts_detail'),
url(r'^jobs/(?P<pk>[0-9]+)/extra_credentials/$', 'job_extra_credentials_list'),
url(r'^job_templates/(?P<pk>[0-9]+)/extra_credentials/$', 'job_template_extra_credentials_list'),
)
urlpatterns = patterns('awx.api.views',
url(r'^$', 'api_root_view'),
url(r'^(?P<version>(v2))/', include(v2_urls)),
url(r'^(?P<version>(v1|v2))/', include(v1_urls))
)
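Note the ordering of the two includes at the bottom: the v2-only list is consulted first, and anything matching `v1|v2` then falls through to the shared v1 patterns. The routing can be checked with plain regexes (paths are illustrative):

```python
import re

# Same pattern shapes as in urlpatterns above: v2-only routes vs. routes
# shared by both API versions
v2_only = re.compile(r'^(?P<version>(v2))/credential_types/')
shared = re.compile(r'^(?P<version>(v1|v2))/jobs/')
```

So `credential_types` resolves only under `/v2/`, while `jobs` resolves under either version prefix.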

awx/api/urls/Pipfile Normal file

awx/api/urls/__init__.py Normal file

@@ -0,0 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from __future__ import absolute_import, unicode_literals
from .urls import urlpatterns
__all__ = ['urlpatterns']


@@ -0,0 +1,17 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
ActivityStreamList,
ActivityStreamDetail,
)
urls = [
url(r'^$', ActivityStreamList.as_view(), name='activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/$', ActivityStreamDetail.as_view(), name='activity_stream_detail'),
]
__all__ = ['urls']


@@ -0,0 +1,29 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
AdHocCommandList,
AdHocCommandDetail,
AdHocCommandCancel,
AdHocCommandRelaunch,
AdHocCommandAdHocCommandEventsList,
AdHocCommandActivityStreamList,
AdHocCommandNotificationsList,
AdHocCommandStdout,
)
urls = [
url(r'^$', AdHocCommandList.as_view(), name='ad_hoc_command_list'),
url(r'^(?P<pk>[0-9]+)/$', AdHocCommandDetail.as_view(), name='ad_hoc_command_detail'),
url(r'^(?P<pk>[0-9]+)/cancel/$', AdHocCommandCancel.as_view(), name='ad_hoc_command_cancel'),
url(r'^(?P<pk>[0-9]+)/relaunch/$', AdHocCommandRelaunch.as_view(), name='ad_hoc_command_relaunch'),
url(r'^(?P<pk>[0-9]+)/events/$', AdHocCommandAdHocCommandEventsList.as_view(), name='ad_hoc_command_ad_hoc_command_events_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', AdHocCommandActivityStreamList.as_view(), name='ad_hoc_command_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/notifications/$', AdHocCommandNotificationsList.as_view(), name='ad_hoc_command_notifications_list'),
url(r'^(?P<pk>[0-9]+)/stdout/$', AdHocCommandStdout.as_view(), name='ad_hoc_command_stdout'),
]
__all__ = ['urls']


@@ -0,0 +1,17 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
AdHocCommandEventList,
AdHocCommandEventDetail,
)
urls = [
url(r'^$', AdHocCommandEventList.as_view(), name='ad_hoc_command_event_list'),
url(r'^(?P<pk>[0-9]+)/$', AdHocCommandEventDetail.as_view(), name='ad_hoc_command_event_detail'),
]
__all__ = ['urls']
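In these new per-resource modules, each route's `name` is the snake_case form of its view class (e.g. `AdHocCommandEventList` → `ad_hoc_command_event_list`). A hypothetical helper (not part of AWX) expressing that convention:

```python
import re

def default_url_name(view_class_name):
    # Hypothetical illustration only: CamelCase view class name ->
    # snake_case URL name, inserting '_' before each interior capital
    return re.sub(r'(?<!^)(?=[A-Z])', '_', view_class_name).lower()
```

Keeping the names mechanical like this means `reverse('ad_hoc_command_event_detail', ...)` lookups stay predictable from the view class alone.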


@@ -0,0 +1,27 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
CredentialList,
CredentialActivityStreamList,
CredentialDetail,
CredentialAccessList,
CredentialObjectRolesList,
CredentialOwnerUsersList,
CredentialOwnerTeamsList,
)
urls = [
url(r'^$', CredentialList.as_view(), name='credential_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', CredentialActivityStreamList.as_view(), name='credential_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/$', CredentialDetail.as_view(), name='credential_detail'),
url(r'^(?P<pk>[0-9]+)/access_list/$', CredentialAccessList.as_view(), name='credential_access_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', CredentialObjectRolesList.as_view(), name='credential_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/owner_users/$', CredentialOwnerUsersList.as_view(), name='credential_owner_users_list'),
url(r'^(?P<pk>[0-9]+)/owner_teams/$', CredentialOwnerTeamsList.as_view(), name='credential_owner_teams_list'),
]
__all__ = ['urls']


@@ -0,0 +1,21 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
CredentialTypeList,
CredentialTypeDetail,
CredentialTypeCredentialList,
CredentialTypeActivityStreamList,
)
urls = [
url(r'^$', CredentialTypeList.as_view(), name='credential_type_list'),
url(r'^(?P<pk>[0-9]+)/$', CredentialTypeDetail.as_view(), name='credential_type_detail'),
url(r'^(?P<pk>[0-9]+)/credentials/$', CredentialTypeCredentialList.as_view(), name='credential_type_credential_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', CredentialTypeActivityStreamList.as_view(), name='credential_type_activity_stream_list'),
]
__all__ = ['urls']

awx/api/urls/group.py Normal file

@@ -0,0 +1,37 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
GroupList,
GroupDetail,
GroupChildrenList,
GroupHostsList,
GroupAllHostsList,
GroupVariableData,
GroupJobEventsList,
GroupJobHostSummariesList,
GroupPotentialChildrenList,
GroupActivityStreamList,
GroupInventorySourcesList,
GroupAdHocCommandsList,
)
urls = [
url(r'^$', GroupList.as_view(), name='group_list'),
url(r'^(?P<pk>[0-9]+)/$', GroupDetail.as_view(), name='group_detail'),
url(r'^(?P<pk>[0-9]+)/children/$', GroupChildrenList.as_view(), name='group_children_list'),
url(r'^(?P<pk>[0-9]+)/hosts/$', GroupHostsList.as_view(), name='group_hosts_list'),
url(r'^(?P<pk>[0-9]+)/all_hosts/$', GroupAllHostsList.as_view(), name='group_all_hosts_list'),
url(r'^(?P<pk>[0-9]+)/variable_data/$', GroupVariableData.as_view(), name='group_variable_data'),
url(r'^(?P<pk>[0-9]+)/job_events/$', GroupJobEventsList.as_view(), name='group_job_events_list'),
url(r'^(?P<pk>[0-9]+)/job_host_summaries/$', GroupJobHostSummariesList.as_view(), name='group_job_host_summaries_list'),
url(r'^(?P<pk>[0-9]+)/potential_children/$', GroupPotentialChildrenList.as_view(), name='group_potential_children_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', GroupActivityStreamList.as_view(), name='group_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/inventory_sources/$', GroupInventorySourcesList.as_view(), name='group_inventory_sources_list'),
url(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', GroupAdHocCommandsList.as_view(), name='group_ad_hoc_commands_list'),
]
__all__ = ['urls']

awx/api/urls/host.py Normal file

@@ -0,0 +1,43 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
HostList,
HostDetail,
HostVariableData,
HostGroupsList,
HostAllGroupsList,
HostJobEventsList,
HostJobHostSummariesList,
HostActivityStreamList,
HostInventorySourcesList,
HostSmartInventoriesList,
HostAdHocCommandsList,
HostAdHocCommandEventsList,
HostFactVersionsList,
HostFactCompareView,
HostInsights,
)
urls = [
url(r'^$', HostList.as_view(), name='host_list'),
url(r'^(?P<pk>[0-9]+)/$', HostDetail.as_view(), name='host_detail'),
url(r'^(?P<pk>[0-9]+)/variable_data/$', HostVariableData.as_view(), name='host_variable_data'),
url(r'^(?P<pk>[0-9]+)/groups/$', HostGroupsList.as_view(), name='host_groups_list'),
url(r'^(?P<pk>[0-9]+)/all_groups/$', HostAllGroupsList.as_view(), name='host_all_groups_list'),
url(r'^(?P<pk>[0-9]+)/job_events/', HostJobEventsList.as_view(), name='host_job_events_list'),
url(r'^(?P<pk>[0-9]+)/job_host_summaries/$', HostJobHostSummariesList.as_view(), name='host_job_host_summaries_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', HostActivityStreamList.as_view(), name='host_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/inventory_sources/$', HostInventorySourcesList.as_view(), name='host_inventory_sources_list'),
url(r'^(?P<pk>[0-9]+)/smart_inventories/$', HostSmartInventoriesList.as_view(), name='host_smart_inventories_list'),
url(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', HostAdHocCommandsList.as_view(), name='host_ad_hoc_commands_list'),
url(r'^(?P<pk>[0-9]+)/ad_hoc_command_events/$', HostAdHocCommandEventsList.as_view(), name='host_ad_hoc_command_events_list'),
url(r'^(?P<pk>[0-9]+)/fact_versions/$', HostFactVersionsList.as_view(), name='host_fact_versions_list'),
url(r'^(?P<pk>[0-9]+)/fact_view/$', HostFactCompareView.as_view(), name='host_fact_compare_view'),
url(r'^(?P<pk>[0-9]+)/insights/$', HostInsights.as_view(), name='host_insights'),
]
__all__ = ['urls']

awx/api/urls/instance.py Normal file

@@ -0,0 +1,22 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
InstanceList,
InstanceDetail,
InstanceUnifiedJobsList,
InstanceInstanceGroupsList,
)
urls = [
url(r'^$', InstanceList.as_view(), name='instance_list'),
url(r'^(?P<pk>[0-9]+)/$', InstanceDetail.as_view(), name='instance_detail'),
url(r'^(?P<pk>[0-9]+)/jobs/$', InstanceUnifiedJobsList.as_view(), name='instance_unified_jobs_list'),
url(r'^(?P<pk>[0-9]+)/instance_groups/$', InstanceInstanceGroupsList.as_view(),
name='instance_instance_groups_list'),
]
__all__ = ['urls']


@@ -0,0 +1,21 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
InstanceGroupList,
InstanceGroupDetail,
InstanceGroupUnifiedJobsList,
InstanceGroupInstanceList,
)
urls = [
url(r'^$', InstanceGroupList.as_view(), name='instance_group_list'),
url(r'^(?P<pk>[0-9]+)/$', InstanceGroupDetail.as_view(), name='instance_group_detail'),
url(r'^(?P<pk>[0-9]+)/jobs/$', InstanceGroupUnifiedJobsList.as_view(), name='instance_group_unified_jobs_list'),
url(r'^(?P<pk>[0-9]+)/instances/$', InstanceGroupInstanceList.as_view(), name='instance_group_instance_list'),
]
__all__ = ['urls']

awx/api/urls/inventory.py Normal file

@@ -0,0 +1,45 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
InventoryList,
InventoryDetail,
InventoryHostsList,
InventoryGroupsList,
InventoryRootGroupsList,
InventoryVariableData,
InventoryScriptView,
InventoryTreeView,
InventoryInventorySourcesList,
InventoryInventorySourcesUpdate,
InventoryActivityStreamList,
InventoryJobTemplateList,
InventoryAdHocCommandsList,
InventoryAccessList,
InventoryObjectRolesList,
InventoryInstanceGroupsList,
)
urls = [
url(r'^$', InventoryList.as_view(), name='inventory_list'),
url(r'^(?P<pk>[0-9]+)/$', InventoryDetail.as_view(), name='inventory_detail'),
url(r'^(?P<pk>[0-9]+)/hosts/$', InventoryHostsList.as_view(), name='inventory_hosts_list'),
url(r'^(?P<pk>[0-9]+)/groups/$', InventoryGroupsList.as_view(), name='inventory_groups_list'),
url(r'^(?P<pk>[0-9]+)/root_groups/$', InventoryRootGroupsList.as_view(), name='inventory_root_groups_list'),
url(r'^(?P<pk>[0-9]+)/variable_data/$', InventoryVariableData.as_view(), name='inventory_variable_data'),
url(r'^(?P<pk>[0-9]+)/script/$', InventoryScriptView.as_view(), name='inventory_script_view'),
url(r'^(?P<pk>[0-9]+)/tree/$', InventoryTreeView.as_view(), name='inventory_tree_view'),
url(r'^(?P<pk>[0-9]+)/inventory_sources/$', InventoryInventorySourcesList.as_view(), name='inventory_inventory_sources_list'),
url(r'^(?P<pk>[0-9]+)/update_inventory_sources/$', InventoryInventorySourcesUpdate.as_view(), name='inventory_inventory_sources_update'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', InventoryActivityStreamList.as_view(), name='inventory_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/job_templates/$', InventoryJobTemplateList.as_view(), name='inventory_job_template_list'),
url(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', InventoryAdHocCommandsList.as_view(), name='inventory_ad_hoc_commands_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', InventoryAccessList.as_view(), name='inventory_access_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', InventoryObjectRolesList.as_view(), name='inventory_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/instance_groups/$', InventoryInstanceGroupsList.as_view(), name='inventory_instance_groups_list'),
]
__all__ = ['urls']
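Every detail route above relies on the same named regex group, `(?P<pk>[0-9]+)`, to capture the object's primary key from the path. A quick sketch with Python's `re` module (illustrative only, not AWX code; Django's resolver compiles and matches these patterns itself) shows what that capture does:

```python
import re

# The same kind of named-group pattern Django compiles for a route like
# r'^(?P<pk>[0-9]+)/hosts/$' (path shown without the '/api/v1/inventories/'
# prefix that include() prepends).
pattern = re.compile(r'^(?P<pk>[0-9]+)/hosts/$')

match = pattern.match('42/hosts/')
print(match.group('pk'))            # the captured primary key, as a string: '42'
print(pattern.match('abc/hosts/'))  # non-numeric pk does not match: None
```

The captured `pk` is what the views receive as a keyword argument when they look up the object.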


@@ -0,0 +1,19 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
InventoryScriptList,
InventoryScriptDetail,
InventoryScriptObjectRolesList,
)
urls = [
url(r'^$', InventoryScriptList.as_view(), name='inventory_script_list'),
url(r'^(?P<pk>[0-9]+)/$', InventoryScriptDetail.as_view(), name='inventory_script_detail'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', InventoryScriptObjectRolesList.as_view(), name='inventory_script_object_roles_list'),
]
__all__ = ['urls']


@@ -0,0 +1,38 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
InventorySourceList,
InventorySourceDetail,
InventorySourceUpdateView,
InventorySourceUpdatesList,
InventorySourceActivityStreamList,
InventorySourceSchedulesList,
InventorySourceGroupsList,
InventorySourceHostsList,
InventorySourceNotificationTemplatesAnyList,
InventorySourceNotificationTemplatesErrorList,
InventorySourceNotificationTemplatesSuccessList,
)
urls = [
url(r'^$', InventorySourceList.as_view(), name='inventory_source_list'),
url(r'^(?P<pk>[0-9]+)/$', InventorySourceDetail.as_view(), name='inventory_source_detail'),
url(r'^(?P<pk>[0-9]+)/update/$', InventorySourceUpdateView.as_view(), name='inventory_source_update_view'),
url(r'^(?P<pk>[0-9]+)/inventory_updates/$', InventorySourceUpdatesList.as_view(), name='inventory_source_updates_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', InventorySourceActivityStreamList.as_view(), name='inventory_source_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/schedules/$', InventorySourceSchedulesList.as_view(), name='inventory_source_schedules_list'),
url(r'^(?P<pk>[0-9]+)/groups/$', InventorySourceGroupsList.as_view(), name='inventory_source_groups_list'),
url(r'^(?P<pk>[0-9]+)/hosts/$', InventorySourceHostsList.as_view(), name='inventory_source_hosts_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', InventorySourceNotificationTemplatesAnyList.as_view(),
name='inventory_source_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', InventorySourceNotificationTemplatesErrorList.as_view(),
name='inventory_source_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', InventorySourceNotificationTemplatesSuccessList.as_view(),
name='inventory_source_notification_templates_success_list'),
]
__all__ = ['urls']


@@ -0,0 +1,23 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
InventoryUpdateList,
InventoryUpdateDetail,
InventoryUpdateCancel,
InventoryUpdateStdout,
InventoryUpdateNotificationsList,
)
urls = [
url(r'^$', InventoryUpdateList.as_view(), name='inventory_update_list'),
url(r'^(?P<pk>[0-9]+)/$', InventoryUpdateDetail.as_view(), name='inventory_update_detail'),
url(r'^(?P<pk>[0-9]+)/cancel/$', InventoryUpdateCancel.as_view(), name='inventory_update_cancel'),
url(r'^(?P<pk>[0-9]+)/stdout/$', InventoryUpdateStdout.as_view(), name='inventory_update_stdout'),
url(r'^(?P<pk>[0-9]+)/notifications/$', InventoryUpdateNotificationsList.as_view(), name='inventory_update_notifications_list'),
]
__all__ = ['urls']

awx/api/urls/job.py

@@ -0,0 +1,39 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
JobList,
JobDetail,
JobStart,
JobCancel,
JobRelaunch,
JobCreateSchedule,
JobJobHostSummariesList,
JobJobEventsList,
JobActivityStreamList,
JobStdout,
JobNotificationsList,
JobLabelList,
)
urls = [
url(r'^$', JobList.as_view(), name='job_list'),
url(r'^(?P<pk>[0-9]+)/$', JobDetail.as_view(), name='job_detail'),
url(r'^(?P<pk>[0-9]+)/start/$', JobStart.as_view(), name='job_start'), # Todo: Remove In 3.3
url(r'^(?P<pk>[0-9]+)/cancel/$', JobCancel.as_view(), name='job_cancel'),
url(r'^(?P<pk>[0-9]+)/relaunch/$', JobRelaunch.as_view(), name='job_relaunch'),
url(r'^(?P<pk>[0-9]+)/create_schedule/$', JobCreateSchedule.as_view(), name='job_create_schedule'),
url(r'^(?P<pk>[0-9]+)/job_host_summaries/$', JobJobHostSummariesList.as_view(), name='job_job_host_summaries_list'),
url(r'^(?P<pk>[0-9]+)/job_events/$', JobJobEventsList.as_view(), name='job_job_events_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', JobActivityStreamList.as_view(), name='job_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/stdout/$', JobStdout.as_view(), name='job_stdout'),
url(r'^(?P<pk>[0-9]+)/notifications/$', JobNotificationsList.as_view(), name='job_notifications_list'),
url(r'^(?P<pk>[0-9]+)/labels/$', JobLabelList.as_view(), name='job_label_list'),
]
__all__ = ['urls']

awx/api/urls/job_event.py

@@ -0,0 +1,21 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
JobEventList,
JobEventDetail,
JobEventChildrenList,
JobEventHostsList,
)
urls = [
url(r'^$', JobEventList.as_view(), name='job_event_list'),
url(r'^(?P<pk>[0-9]+)/$', JobEventDetail.as_view(), name='job_event_detail'),
url(r'^(?P<pk>[0-9]+)/children/$', JobEventChildrenList.as_view(), name='job_event_children_list'),
url(r'^(?P<pk>[0-9]+)/hosts/$', JobEventHostsList.as_view(), name='job_event_hosts_list'),
]
__all__ = ['urls']


@@ -0,0 +1,15 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
JobHostSummaryDetail,
)
urls = [
url(r'^(?P<pk>[0-9]+)/$', JobHostSummaryDetail.as_view(), name='job_host_summary_detail'),
]
__all__ = ['urls']


@@ -0,0 +1,46 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
JobTemplateList,
JobTemplateDetail,
JobTemplateLaunch,
JobTemplateJobsList,
JobTemplateCallback,
JobTemplateSchedulesList,
JobTemplateSurveySpec,
JobTemplateActivityStreamList,
JobTemplateNotificationTemplatesAnyList,
JobTemplateNotificationTemplatesErrorList,
JobTemplateNotificationTemplatesSuccessList,
JobTemplateInstanceGroupsList,
JobTemplateAccessList,
JobTemplateObjectRolesList,
JobTemplateLabelList,
)
urls = [
url(r'^$', JobTemplateList.as_view(), name='job_template_list'),
url(r'^(?P<pk>[0-9]+)/$', JobTemplateDetail.as_view(), name='job_template_detail'),
url(r'^(?P<pk>[0-9]+)/launch/$', JobTemplateLaunch.as_view(), name='job_template_launch'),
url(r'^(?P<pk>[0-9]+)/jobs/$', JobTemplateJobsList.as_view(), name='job_template_jobs_list'),
url(r'^(?P<pk>[0-9]+)/callback/$', JobTemplateCallback.as_view(), name='job_template_callback'),
url(r'^(?P<pk>[0-9]+)/schedules/$', JobTemplateSchedulesList.as_view(), name='job_template_schedules_list'),
url(r'^(?P<pk>[0-9]+)/survey_spec/$', JobTemplateSurveySpec.as_view(), name='job_template_survey_spec'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', JobTemplateActivityStreamList.as_view(), name='job_template_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', JobTemplateNotificationTemplatesAnyList.as_view(),
name='job_template_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', JobTemplateNotificationTemplatesErrorList.as_view(),
name='job_template_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', JobTemplateNotificationTemplatesSuccessList.as_view(),
name='job_template_notification_templates_success_list'),
url(r'^(?P<pk>[0-9]+)/instance_groups/$', JobTemplateInstanceGroupsList.as_view(), name='job_template_instance_groups_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', JobTemplateAccessList.as_view(), name='job_template_access_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', JobTemplateObjectRolesList.as_view(), name='job_template_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/labels/$', JobTemplateLabelList.as_view(), name='job_template_label_list'),
]
__all__ = ['urls']

awx/api/urls/label.py

@@ -0,0 +1,17 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
LabelList,
LabelDetail,
)
urls = [
url(r'^$', LabelList.as_view(), name='label_list'),
url(r'^(?P<pk>[0-9]+)/$', LabelDetail.as_view(), name='label_detail'),
]
__all__ = ['urls']


@@ -0,0 +1,17 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
NotificationList,
NotificationDetail,
)
urls = [
url(r'^$', NotificationList.as_view(), name='notification_list'),
url(r'^(?P<pk>[0-9]+)/$', NotificationDetail.as_view(), name='notification_detail'),
]
__all__ = ['urls']


@@ -0,0 +1,21 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
NotificationTemplateList,
NotificationTemplateDetail,
NotificationTemplateTest,
NotificationTemplateNotificationList,
)
urls = [
url(r'^$', NotificationTemplateList.as_view(), name='notification_template_list'),
url(r'^(?P<pk>[0-9]+)/$', NotificationTemplateDetail.as_view(), name='notification_template_detail'),
url(r'^(?P<pk>[0-9]+)/test/$', NotificationTemplateTest.as_view(), name='notification_template_test'),
url(r'^(?P<pk>[0-9]+)/notifications/$', NotificationTemplateNotificationList.as_view(), name='notification_template_notification_list'),
]
__all__ = ['urls']


@@ -0,0 +1,50 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
OrganizationList,
OrganizationDetail,
OrganizationUsersList,
OrganizationAdminsList,
OrganizationInventoriesList,
OrganizationProjectsList,
OrganizationWorkflowJobTemplatesList,
OrganizationTeamsList,
OrganizationCredentialList,
OrganizationActivityStreamList,
OrganizationNotificationTemplatesList,
OrganizationNotificationTemplatesAnyList,
OrganizationNotificationTemplatesErrorList,
OrganizationNotificationTemplatesSuccessList,
OrganizationInstanceGroupsList,
OrganizationObjectRolesList,
OrganizationAccessList,
)
urls = [
url(r'^$', OrganizationList.as_view(), name='organization_list'),
url(r'^(?P<pk>[0-9]+)/$', OrganizationDetail.as_view(), name='organization_detail'),
url(r'^(?P<pk>[0-9]+)/users/$', OrganizationUsersList.as_view(), name='organization_users_list'),
url(r'^(?P<pk>[0-9]+)/admins/$', OrganizationAdminsList.as_view(), name='organization_admins_list'),
url(r'^(?P<pk>[0-9]+)/inventories/$', OrganizationInventoriesList.as_view(), name='organization_inventories_list'),
url(r'^(?P<pk>[0-9]+)/projects/$', OrganizationProjectsList.as_view(), name='organization_projects_list'),
url(r'^(?P<pk>[0-9]+)/workflow_job_templates/$', OrganizationWorkflowJobTemplatesList.as_view(), name='organization_workflow_job_templates_list'),
url(r'^(?P<pk>[0-9]+)/teams/$', OrganizationTeamsList.as_view(), name='organization_teams_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', OrganizationCredentialList.as_view(), name='organization_credential_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', OrganizationActivityStreamList.as_view(), name='organization_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates/$', OrganizationNotificationTemplatesList.as_view(), name='organization_notification_templates_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', OrganizationNotificationTemplatesAnyList.as_view(),
name='organization_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', OrganizationNotificationTemplatesErrorList.as_view(),
name='organization_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', OrganizationNotificationTemplatesSuccessList.as_view(),
name='organization_notification_templates_success_list'),
url(r'^(?P<pk>[0-9]+)/instance_groups/$', OrganizationInstanceGroupsList.as_view(), name='organization_instance_groups_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', OrganizationObjectRolesList.as_view(), name='organization_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', OrganizationAccessList.as_view(), name='organization_access_list'),
]
__all__ = ['urls']

awx/api/urls/project.py

@@ -0,0 +1,44 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
ProjectList,
ProjectDetail,
ProjectPlaybooks,
ProjectInventories,
ProjectScmInventorySources,
ProjectTeamsList,
ProjectUpdateView,
ProjectUpdatesList,
ProjectActivityStreamList,
ProjectSchedulesList,
ProjectNotificationTemplatesAnyList,
ProjectNotificationTemplatesErrorList,
ProjectNotificationTemplatesSuccessList,
ProjectObjectRolesList,
ProjectAccessList,
)
urls = [
url(r'^$', ProjectList.as_view(), name='project_list'),
url(r'^(?P<pk>[0-9]+)/$', ProjectDetail.as_view(), name='project_detail'),
url(r'^(?P<pk>[0-9]+)/playbooks/$', ProjectPlaybooks.as_view(), name='project_playbooks'),
url(r'^(?P<pk>[0-9]+)/inventories/$', ProjectInventories.as_view(), name='project_inventories'),
url(r'^(?P<pk>[0-9]+)/scm_inventory_sources/$', ProjectScmInventorySources.as_view(), name='project_scm_inventory_sources'),
url(r'^(?P<pk>[0-9]+)/teams/$', ProjectTeamsList.as_view(), name='project_teams_list'),
url(r'^(?P<pk>[0-9]+)/update/$', ProjectUpdateView.as_view(), name='project_update_view'),
url(r'^(?P<pk>[0-9]+)/project_updates/$', ProjectUpdatesList.as_view(), name='project_updates_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', ProjectActivityStreamList.as_view(), name='project_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/schedules/$', ProjectSchedulesList.as_view(), name='project_schedules_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', ProjectNotificationTemplatesAnyList.as_view(), name='project_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', ProjectNotificationTemplatesErrorList.as_view(), name='project_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', ProjectNotificationTemplatesSuccessList.as_view(),
name='project_notification_templates_success_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', ProjectObjectRolesList.as_view(), name='project_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', ProjectAccessList.as_view(), name='project_access_list'),
]
__all__ = ['urls']


@@ -0,0 +1,25 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
ProjectUpdateList,
ProjectUpdateDetail,
ProjectUpdateCancel,
ProjectUpdateStdout,
ProjectUpdateScmInventoryUpdates,
ProjectUpdateNotificationsList,
)
urls = [
url(r'^$', ProjectUpdateList.as_view(), name='project_update_list'),
url(r'^(?P<pk>[0-9]+)/$', ProjectUpdateDetail.as_view(), name='project_update_detail'),
url(r'^(?P<pk>[0-9]+)/cancel/$', ProjectUpdateCancel.as_view(), name='project_update_cancel'),
url(r'^(?P<pk>[0-9]+)/stdout/$', ProjectUpdateStdout.as_view(), name='project_update_stdout'),
url(r'^(?P<pk>[0-9]+)/scm_inventory_updates/$', ProjectUpdateScmInventoryUpdates.as_view(), name='project_update_scm_inventory_updates'),
url(r'^(?P<pk>[0-9]+)/notifications/$', ProjectUpdateNotificationsList.as_view(), name='project_update_notifications_list'),
]
__all__ = ['urls']

awx/api/urls/role.py

@@ -0,0 +1,25 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
RoleList,
RoleDetail,
RoleUsersList,
RoleTeamsList,
RoleParentsList,
RoleChildrenList,
)
urls = [
url(r'^$', RoleList.as_view(), name='role_list'),
url(r'^(?P<pk>[0-9]+)/$', RoleDetail.as_view(), name='role_detail'),
url(r'^(?P<pk>[0-9]+)/users/$', RoleUsersList.as_view(), name='role_users_list'),
url(r'^(?P<pk>[0-9]+)/teams/$', RoleTeamsList.as_view(), name='role_teams_list'),
url(r'^(?P<pk>[0-9]+)/parents/$', RoleParentsList.as_view(), name='role_parents_list'),
url(r'^(?P<pk>[0-9]+)/children/$', RoleChildrenList.as_view(), name='role_children_list'),
]
__all__ = ['urls']

awx/api/urls/schedule.py

@@ -0,0 +1,21 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
ScheduleList,
ScheduleDetail,
ScheduleUnifiedJobsList,
ScheduleCredentialsList,
)
urls = [
url(r'^$', ScheduleList.as_view(), name='schedule_list'),
url(r'^(?P<pk>[0-9]+)/$', ScheduleDetail.as_view(), name='schedule_detail'),
url(r'^(?P<pk>[0-9]+)/jobs/$', ScheduleUnifiedJobsList.as_view(), name='schedule_unified_jobs_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', ScheduleCredentialsList.as_view(), name='schedule_credentials_list'),
]
__all__ = ['urls']


@@ -0,0 +1,21 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
SystemJobList,
SystemJobDetail,
SystemJobCancel,
SystemJobNotificationsList,
)
urls = [
url(r'^$', SystemJobList.as_view(), name='system_job_list'),
url(r'^(?P<pk>[0-9]+)/$', SystemJobDetail.as_view(), name='system_job_detail'),
url(r'^(?P<pk>[0-9]+)/cancel/$', SystemJobCancel.as_view(), name='system_job_cancel'),
url(r'^(?P<pk>[0-9]+)/notifications/$', SystemJobNotificationsList.as_view(), name='system_job_notifications_list'),
]
__all__ = ['urls']


@@ -0,0 +1,32 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
SystemJobTemplateList,
SystemJobTemplateDetail,
SystemJobTemplateLaunch,
SystemJobTemplateJobsList,
SystemJobTemplateSchedulesList,
SystemJobTemplateNotificationTemplatesAnyList,
SystemJobTemplateNotificationTemplatesErrorList,
SystemJobTemplateNotificationTemplatesSuccessList,
)
urls = [
url(r'^$', SystemJobTemplateList.as_view(), name='system_job_template_list'),
url(r'^(?P<pk>[0-9]+)/$', SystemJobTemplateDetail.as_view(), name='system_job_template_detail'),
url(r'^(?P<pk>[0-9]+)/launch/$', SystemJobTemplateLaunch.as_view(), name='system_job_template_launch'),
url(r'^(?P<pk>[0-9]+)/jobs/$', SystemJobTemplateJobsList.as_view(), name='system_job_template_jobs_list'),
url(r'^(?P<pk>[0-9]+)/schedules/$', SystemJobTemplateSchedulesList.as_view(), name='system_job_template_schedules_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', SystemJobTemplateNotificationTemplatesAnyList.as_view(),
name='system_job_template_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', SystemJobTemplateNotificationTemplatesErrorList.as_view(),
name='system_job_template_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', SystemJobTemplateNotificationTemplatesSuccessList.as_view(),
name='system_job_template_notification_templates_success_list'),
]
__all__ = ['urls']

awx/api/urls/team.py

@@ -0,0 +1,31 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
TeamList,
TeamDetail,
TeamProjectsList,
TeamUsersList,
TeamCredentialsList,
TeamRolesList,
TeamObjectRolesList,
TeamActivityStreamList,
TeamAccessList,
)
urls = [
url(r'^$', TeamList.as_view(), name='team_list'),
url(r'^(?P<pk>[0-9]+)/$', TeamDetail.as_view(), name='team_detail'),
url(r'^(?P<pk>[0-9]+)/projects/$', TeamProjectsList.as_view(), name='team_projects_list'),
url(r'^(?P<pk>[0-9]+)/users/$', TeamUsersList.as_view(), name='team_users_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', TeamCredentialsList.as_view(), name='team_credentials_list'),
url(r'^(?P<pk>[0-9]+)/roles/$', TeamRolesList.as_view(), name='team_roles_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', TeamObjectRolesList.as_view(), name='team_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', TeamActivityStreamList.as_view(), name='team_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', TeamAccessList.as_view(), name='team_access_list'),
]
__all__ = ['urls']
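Each of these modules exports a flat `urls` list, and the top-level `urls.py` mounts them under a resource prefix with `include()`. The effect is that each sub-pattern is matched against the path *after* the mount prefix is stripped. A small stdlib sketch of that composition (hypothetical routes and view names, not Django's actual resolver):

```python
import re

# Sketch of what include(r'^teams/', team_urls) effectively does: match the
# prefix first, then match the sub-patterns against the remainder of the path.
team_urls = [
    (r'^$', 'team_list'),
    (r'^(?P<pk>[0-9]+)/$', 'team_detail'),
]

def resolve(path, prefix, sub_urls):
    m = re.match(prefix, path)
    if not m:
        return None
    rest = path[m.end():]  # strip the mount prefix before trying sub-routes
    for route, view in sub_urls:
        if re.match(route, rest):
            return view
    return None

print(resolve('teams/7/', r'^teams/', team_urls))  # 'team_detail'
print(resolve('teams/', r'^teams/', team_urls))    # 'team_list'
```

This is why every sub-module's patterns start at `^` without repeating the resource name: the prefix lives only in the `include()` call.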

awx/api/urls/urls.py

@@ -0,0 +1,123 @@
# Copyright (c) 2015 Ansible, Inc.
# All Rights Reserved.
from __future__ import absolute_import, unicode_literals
from django.conf.urls import include, url
from awx.api.views import (
ApiRootView,
ApiV1RootView,
ApiV2RootView,
ApiV1PingView,
ApiV1ConfigView,
AuthView,
AuthTokenView,
UserMeList,
DashboardView,
DashboardJobsGraphView,
UnifiedJobTemplateList,
UnifiedJobList,
HostAnsibleFactsDetail,
JobCredentialsList,
JobExtraCredentialsList,
JobTemplateCredentialsList,
JobTemplateExtraCredentialsList,
)
from .organization import urls as organization_urls
from .user import urls as user_urls
from .project import urls as project_urls
from .project_update import urls as project_update_urls
from .inventory import urls as inventory_urls
from .team import urls as team_urls
from .host import urls as host_urls
from .group import urls as group_urls
from .inventory_source import urls as inventory_source_urls
from .inventory_update import urls as inventory_update_urls
from .inventory_script import urls as inventory_script_urls
from .credential_type import urls as credential_type_urls
from .credential import urls as credential_urls
from .role import urls as role_urls
from .job_template import urls as job_template_urls
from .job import urls as job_urls
from .job_host_summary import urls as job_host_summary_urls
from .job_event import urls as job_event_urls
from .ad_hoc_command import urls as ad_hoc_command_urls
from .ad_hoc_command_event import urls as ad_hoc_command_event_urls
from .system_job_template import urls as system_job_template_urls
from .system_job import urls as system_job_urls
from .workflow_job_template import urls as workflow_job_template_urls
from .workflow_job import urls as workflow_job_urls
from .notification_template import urls as notification_template_urls
from .notification import urls as notification_urls
from .label import urls as label_urls
from .workflow_job_template_node import urls as workflow_job_template_node_urls
from .workflow_job_node import urls as workflow_job_node_urls
from .schedule import urls as schedule_urls
from .activity_stream import urls as activity_stream_urls
from .instance import urls as instance_urls
from .instance_group import urls as instance_group_urls
v1_urls = [
url(r'^$', ApiV1RootView.as_view(), name='api_v1_root_view'),
url(r'^ping/$', ApiV1PingView.as_view(), name='api_v1_ping_view'),
url(r'^config/$', ApiV1ConfigView.as_view(), name='api_v1_config_view'),
url(r'^auth/$', AuthView.as_view()),
url(r'^authtoken/$', AuthTokenView.as_view(), name='auth_token_view'),
url(r'^me/$', UserMeList.as_view(), name='user_me_list'),
url(r'^dashboard/$', DashboardView.as_view(), name='dashboard_view'),
url(r'^dashboard/graphs/jobs/$', DashboardJobsGraphView.as_view(), name='dashboard_jobs_graph_view'),
url(r'^settings/', include('awx.conf.urls')),
url(r'^instances/', include(instance_urls)),
url(r'^instance_groups/', include(instance_group_urls)),
url(r'^schedules/', include(schedule_urls)),
url(r'^organizations/', include(organization_urls)),
url(r'^users/', include(user_urls)),
url(r'^projects/', include(project_urls)),
url(r'^project_updates/', include(project_update_urls)),
url(r'^teams/', include(team_urls)),
url(r'^inventories/', include(inventory_urls)),
url(r'^hosts/', include(host_urls)),
url(r'^groups/', include(group_urls)),
url(r'^inventory_sources/', include(inventory_source_urls)),
url(r'^inventory_updates/', include(inventory_update_urls)),
url(r'^inventory_scripts/', include(inventory_script_urls)),
url(r'^credentials/', include(credential_urls)),
url(r'^roles/', include(role_urls)),
url(r'^job_templates/', include(job_template_urls)),
url(r'^jobs/', include(job_urls)),
url(r'^job_host_summaries/', include(job_host_summary_urls)),
url(r'^job_events/', include(job_event_urls)),
url(r'^ad_hoc_commands/', include(ad_hoc_command_urls)),
url(r'^ad_hoc_command_events/', include(ad_hoc_command_event_urls)),
url(r'^system_job_templates/', include(system_job_template_urls)),
url(r'^system_jobs/', include(system_job_urls)),
url(r'^notification_templates/', include(notification_template_urls)),
url(r'^notifications/', include(notification_urls)),
url(r'^workflow_job_templates/', include(workflow_job_template_urls)),
url(r'^workflow_jobs/', include(workflow_job_urls)),
url(r'^labels/', include(label_urls)),
url(r'^workflow_job_template_nodes/', include(workflow_job_template_node_urls)),
url(r'^workflow_job_nodes/', include(workflow_job_node_urls)),
url(r'^unified_job_templates/$', UnifiedJobTemplateList.as_view(), name='unified_job_template_list'),
url(r'^unified_jobs/$', UnifiedJobList.as_view(), name='unified_job_list'),
url(r'^activity_stream/', include(activity_stream_urls)),
]
v2_urls = [
url(r'^$', ApiV2RootView.as_view(), name='api_v2_root_view'),
url(r'^credential_types/', include(credential_type_urls)),
url(r'^hosts/(?P<pk>[0-9]+)/ansible_facts/$', HostAnsibleFactsDetail.as_view(), name='host_ansible_facts_detail'),
url(r'^jobs/(?P<pk>[0-9]+)/extra_credentials/$', JobExtraCredentialsList.as_view(), name='job_extra_credentials_list'),
url(r'^jobs/(?P<pk>[0-9]+)/credentials/$', JobCredentialsList.as_view(), name='job_credentials_list'),
url(r'^job_templates/(?P<pk>[0-9]+)/extra_credentials/$', JobTemplateExtraCredentialsList.as_view(), name='job_template_extra_credentials_list'),
url(r'^job_templates/(?P<pk>[0-9]+)/credentials/$', JobTemplateCredentialsList.as_view(), name='job_template_credentials_list'),
]
app_name = 'api'
urlpatterns = [
url(r'^$', ApiRootView.as_view(), name='api_root_view'),
url(r'^(?P<version>(v2))/', include(v2_urls)),
url(r'^(?P<version>(v1|v2))/', include(v1_urls))
]
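Note the ordering in `urlpatterns`: `v2_urls` is mounted under `(?P<version>(v2))` before the shared `v1_urls` under `(?P<version>(v1|v2))`, so v2-only endpoints (such as `credential_types`) resolve first, while everything in `v1_urls` stays reachable under both `/v1/` and `/v2/`. A minimal sketch of that first-match dispatch (routes and view names are illustrative; Django's resolver does the real work):

```python
import re

# First-match dispatch over the two versioned prefixes, in the same order
# as urlpatterns above: v2-only routes first, then routes shared by v1/v2.
routes = [
    (re.compile(r'^(?P<version>v2)/credential_types/$'), 'credential_type_list'),
    (re.compile(r'^(?P<version>v1|v2)/jobs/$'), 'job_list'),
]

def resolve(path):
    for pattern, view in routes:
        m = pattern.match(path)
        if m:
            return view, m.group('version')
    return None

print(resolve('v2/credential_types/'))  # v2-only entry matches
print(resolve('v1/jobs/'))              # shared entry matches either version
print(resolve('v1/credential_types/'))  # v2-only endpoint has no v1 route: None
```

Because the `version` kwarg is captured by the prefix, views can branch on it without separate view classes per API version.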

awx/api/urls/user.py

@@ -0,0 +1,33 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
UserList,
UserDetail,
UserTeamsList,
UserOrganizationsList,
UserAdminOfOrganizationsList,
UserProjectsList,
UserCredentialsList,
UserRolesList,
UserActivityStreamList,
UserAccessList,
)
urls = [
url(r'^$', UserList.as_view(), name='user_list'),
url(r'^(?P<pk>[0-9]+)/$', UserDetail.as_view(), name='user_detail'),
url(r'^(?P<pk>[0-9]+)/teams/$', UserTeamsList.as_view(), name='user_teams_list'),
url(r'^(?P<pk>[0-9]+)/organizations/$', UserOrganizationsList.as_view(), name='user_organizations_list'),
url(r'^(?P<pk>[0-9]+)/admin_of_organizations/$', UserAdminOfOrganizationsList.as_view(), name='user_admin_of_organizations_list'),
url(r'^(?P<pk>[0-9]+)/projects/$', UserProjectsList.as_view(), name='user_projects_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', UserCredentialsList.as_view(), name='user_credentials_list'),
url(r'^(?P<pk>[0-9]+)/roles/$', UserRolesList.as_view(), name='user_roles_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', UserActivityStreamList.as_view(), name='user_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', UserAccessList.as_view(), name='user_access_list'),
]
__all__ = ['urls']


@@ -0,0 +1,29 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
WorkflowJobList,
WorkflowJobDetail,
WorkflowJobWorkflowNodesList,
WorkflowJobLabelList,
WorkflowJobCancel,
WorkflowJobRelaunch,
WorkflowJobNotificationsList,
WorkflowJobActivityStreamList,
)
urls = [
url(r'^$', WorkflowJobList.as_view(), name='workflow_job_list'),
url(r'^(?P<pk>[0-9]+)/$', WorkflowJobDetail.as_view(), name='workflow_job_detail'),
url(r'^(?P<pk>[0-9]+)/workflow_nodes/$', WorkflowJobWorkflowNodesList.as_view(), name='workflow_job_workflow_nodes_list'),
url(r'^(?P<pk>[0-9]+)/labels/$', WorkflowJobLabelList.as_view(), name='workflow_job_label_list'),
url(r'^(?P<pk>[0-9]+)/cancel/$', WorkflowJobCancel.as_view(), name='workflow_job_cancel'),
url(r'^(?P<pk>[0-9]+)/relaunch/$', WorkflowJobRelaunch.as_view(), name='workflow_job_relaunch'),
url(r'^(?P<pk>[0-9]+)/notifications/$', WorkflowJobNotificationsList.as_view(), name='workflow_job_notifications_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', WorkflowJobActivityStreamList.as_view(), name='workflow_job_activity_stream_list'),
]
__all__ = ['urls']


@@ -0,0 +1,25 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
WorkflowJobNodeList,
WorkflowJobNodeDetail,
WorkflowJobNodeSuccessNodesList,
WorkflowJobNodeFailureNodesList,
WorkflowJobNodeAlwaysNodesList,
WorkflowJobNodeCredentialsList,
)
urls = [
url(r'^$', WorkflowJobNodeList.as_view(), name='workflow_job_node_list'),
url(r'^(?P<pk>[0-9]+)/$', WorkflowJobNodeDetail.as_view(), name='workflow_job_node_detail'),
url(r'^(?P<pk>[0-9]+)/success_nodes/$', WorkflowJobNodeSuccessNodesList.as_view(), name='workflow_job_node_success_nodes_list'),
url(r'^(?P<pk>[0-9]+)/failure_nodes/$', WorkflowJobNodeFailureNodesList.as_view(), name='workflow_job_node_failure_nodes_list'),
url(r'^(?P<pk>[0-9]+)/always_nodes/$', WorkflowJobNodeAlwaysNodesList.as_view(), name='workflow_job_node_always_nodes_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', WorkflowJobNodeCredentialsList.as_view(), name='workflow_job_node_credentials_list'),
]
__all__ = ['urls']


@@ -0,0 +1,46 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
WorkflowJobTemplateList,
WorkflowJobTemplateDetail,
WorkflowJobTemplateJobsList,
WorkflowJobTemplateLaunch,
WorkflowJobTemplateCopy,
WorkflowJobTemplateSchedulesList,
WorkflowJobTemplateSurveySpec,
WorkflowJobTemplateWorkflowNodesList,
WorkflowJobTemplateActivityStreamList,
WorkflowJobTemplateNotificationTemplatesAnyList,
WorkflowJobTemplateNotificationTemplatesErrorList,
WorkflowJobTemplateNotificationTemplatesSuccessList,
WorkflowJobTemplateAccessList,
WorkflowJobTemplateObjectRolesList,
WorkflowJobTemplateLabelList,
)
urls = [
url(r'^$', WorkflowJobTemplateList.as_view(), name='workflow_job_template_list'),
url(r'^(?P<pk>[0-9]+)/$', WorkflowJobTemplateDetail.as_view(), name='workflow_job_template_detail'),
url(r'^(?P<pk>[0-9]+)/workflow_jobs/$', WorkflowJobTemplateJobsList.as_view(), name='workflow_job_template_jobs_list'),
url(r'^(?P<pk>[0-9]+)/launch/$', WorkflowJobTemplateLaunch.as_view(), name='workflow_job_template_launch'),
url(r'^(?P<pk>[0-9]+)/copy/$', WorkflowJobTemplateCopy.as_view(), name='workflow_job_template_copy'),
url(r'^(?P<pk>[0-9]+)/schedules/$', WorkflowJobTemplateSchedulesList.as_view(), name='workflow_job_template_schedules_list'),
url(r'^(?P<pk>[0-9]+)/survey_spec/$', WorkflowJobTemplateSurveySpec.as_view(), name='workflow_job_template_survey_spec'),
url(r'^(?P<pk>[0-9]+)/workflow_nodes/$', WorkflowJobTemplateWorkflowNodesList.as_view(), name='workflow_job_template_workflow_nodes_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', WorkflowJobTemplateActivityStreamList.as_view(), name='workflow_job_template_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_any/$', WorkflowJobTemplateNotificationTemplatesAnyList.as_view(),
name='workflow_job_template_notification_templates_any_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', WorkflowJobTemplateNotificationTemplatesErrorList.as_view(),
name='workflow_job_template_notification_templates_error_list'),
url(r'^(?P<pk>[0-9]+)/notification_templates_success/$', WorkflowJobTemplateNotificationTemplatesSuccessList.as_view(),
name='workflow_job_template_notification_templates_success_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', WorkflowJobTemplateAccessList.as_view(), name='workflow_job_template_access_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', WorkflowJobTemplateObjectRolesList.as_view(), name='workflow_job_template_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/labels/$', WorkflowJobTemplateLabelList.as_view(), name='workflow_job_template_label_list'),
]
__all__ = ['urls']


@@ -0,0 +1,25 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
WorkflowJobTemplateNodeList,
WorkflowJobTemplateNodeDetail,
WorkflowJobTemplateNodeSuccessNodesList,
WorkflowJobTemplateNodeFailureNodesList,
WorkflowJobTemplateNodeAlwaysNodesList,
WorkflowJobTemplateNodeCredentialsList,
)
urls = [
url(r'^$', WorkflowJobTemplateNodeList.as_view(), name='workflow_job_template_node_list'),
url(r'^(?P<pk>[0-9]+)/$', WorkflowJobTemplateNodeDetail.as_view(), name='workflow_job_template_node_detail'),
url(r'^(?P<pk>[0-9]+)/success_nodes/$', WorkflowJobTemplateNodeSuccessNodesList.as_view(), name='workflow_job_template_node_success_nodes_list'),
url(r'^(?P<pk>[0-9]+)/failure_nodes/$', WorkflowJobTemplateNodeFailureNodesList.as_view(), name='workflow_job_template_node_failure_nodes_list'),
url(r'^(?P<pk>[0-9]+)/always_nodes/$', WorkflowJobTemplateNodeAlwaysNodesList.as_view(), name='workflow_job_template_node_always_nodes_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', WorkflowJobTemplateNodeCredentialsList.as_view(), name='workflow_job_template_node_credentials_list'),
]
__all__ = ['urls']
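The `(?P<pk>[0-9]+)` named groups in the routes above are ordinary Python regexes; a minimal standalone sketch of how a primary key is captured (no Django required, the path string is illustrative):

```python
import re

# Same pattern shape as the node routes above: a numeric pk followed by a sub-path.
pattern = re.compile(r'^(?P<pk>[0-9]+)/success_nodes/$')

match = pattern.match('42/success_nodes/')
print(match.group('pk'))
```

Non-numeric prefixes fail to match, which is how these routes reject malformed primary keys before a view ever runs.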


@@ -13,7 +13,7 @@ import sys
import logging
import requests
from base64 import b64encode
from collections import OrderedDict
from collections import OrderedDict, Iterable
# Django
from django.conf import settings
@@ -27,7 +27,6 @@ from django.utils.timezone import now
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.cache import never_cache
from django.template.loader import render_to_string
from django.core.servers.basehttp import FileWrapper
from django.http import HttpResponse
from django.contrib.contenttypes.models import ContentType
from django.utils.translation import ugettext_lazy as _
@@ -53,13 +52,15 @@ import qsstats
import ansiconv
# Python Social Auth
from social.backends.utils import load_backends
from social_core.backends.utils import load_backends
from wsgiref.util import FileWrapper
# AWX
from awx.main.tasks import send_notifications
from awx.main.access import get_user_queryset
from awx.main.ha import is_ha_environment
from awx.api.authentication import TaskAuthentication, TokenGetAuthentication
from awx.api.authentication import TokenGetAuthentication
from awx.api.filters import V1CredentialFilterBackend
from awx.api.generics import get_view_name
from awx.api.generics import * # noqa
@@ -68,7 +69,7 @@ from awx.conf.license import get_license, feature_enabled, feature_exists, Licen
from awx.main.models import * # noqa
from awx.main.utils import * # noqa
from awx.main.utils import (
callback_filter_out_ansible_extra_vars,
extract_ansible_vars,
decrypt_field,
)
from awx.main.utils.filters import SmartFilter
@@ -139,7 +140,8 @@ class UnifiedJobDeletionMixin(object):
raise PermissionDenied(detail=_('Cannot delete job resource when associated workflow job is running.'))
except self.model.unified_job_node.RelatedObjectDoesNotExist:
pass
if obj.status in ACTIVE_STATES:
# Still allow deletion of jobs in the 'new' status, because these can be created manually
if obj.status in ACTIVE_STATES and obj.status != 'new':
raise PermissionDenied(detail=_("Cannot delete running job resource."))
obj.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
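The relaxed deletion guard above can be read as a pure function of the status string. A sketch; the `ACTIVE_STATES` tuple here is an assumption inferred from the added `!= 'new'` check, which only makes sense if `'new'` counts as active:

```python
# Assumed set of active states; 'new' is included, which is why the
# extra `!= 'new'` condition in the view is needed to permit deletion.
ACTIVE_STATES = ('new', 'pending', 'waiting', 'running')

def deletion_allowed(status):
    # Jobs in 'new' can be created manually, so they may be deleted
    # even though 'new' is nominally an active state.
    return status not in ACTIVE_STATES or status == 'new'

print(deletion_allowed('new'))
print(deletion_allowed('running'))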
@@ -359,7 +361,7 @@ class ApiV1ConfigView(APIView):
try:
settings.LICENSE = {}
return Response(status=status.HTTP_204_NO_CONTENT)
except:
except Exception:
# FIX: Log
return Response({"error": _("Failed to remove license (%s)") % has_error}, status=status.HTTP_400_BAD_REQUEST)
@@ -605,6 +607,46 @@ class ScheduleDetail(RetrieveUpdateDestroyAPIView):
new_in_148 = True
class LaunchConfigCredentialsBase(SubListAttachDetachAPIView):
model = Credential
serializer_class = CredentialSerializer
relationship = 'credentials'
def is_valid_relation(self, parent, sub, created=False):
if not parent.unified_job_template:
return {"msg": _("Cannot assign credential when related template is null.")}
ask_mapping = parent.unified_job_template.get_ask_mapping()
if self.relationship not in ask_mapping:
return {"msg": _("Related template cannot accept {} on launch.").format(self.relationship)}
elif sub.passwords_needed:
return {"msg": _("Credential that requires user input on launch "
"cannot be used in saved launch configuration.")}
ask_field_name = ask_mapping[self.relationship]
if not getattr(parent.unified_job_template, ask_field_name):
return {"msg": _("Related template is not configured to accept credentials on launch.")}
elif sub.unique_hash() in [cred.unique_hash() for cred in parent.credentials.all()]:
return {"msg": _("This launch configuration already provides a {credential_type} credential.").format(
credential_type=sub.unique_hash(display=True))}
elif sub.pk in parent.unified_job_template.credentials.values_list('pk', flat=True):
return {"msg": _("Related template already uses {credential_type} credential.").format(
credential_type=sub.name)}
# None means there were no validation errors
return None
class ScheduleCredentialsList(LaunchConfigCredentialsBase):
parent_model = Schedule
new_in_330 = True
new_in_api_v2 = True
class ScheduleUnifiedJobsList(SubListAPIView):
model = UnifiedJob
@@ -1299,8 +1341,11 @@ class ProjectUpdateView(RetrieveAPIView):
if not project_update:
return Response({}, status=status.HTTP_400_BAD_REQUEST)
else:
data = OrderedDict()
data['project_update'] = project_update.id
data.update(ProjectUpdateSerializer(project_update, context=self.get_serializer_context()).to_representation(project_update))
headers = {'Location': project_update.get_absolute_url(request=request)}
return Response({'project_update': project_update.id},
return Response(data,
headers=headers,
status=status.HTTP_202_ACCEPTED)
else:
@@ -1324,8 +1369,8 @@ class ProjectUpdateDetail(UnifiedJobDeletionMixin, RetrieveDestroyAPIView):
class ProjectUpdateCancel(RetrieveAPIView):
model = ProjectUpdate
obj_permission_type = 'cancel'
serializer_class = ProjectUpdateCancelSerializer
is_job_cancel = True
new_in_13 = True
def post(self, request, *args, **kwargs):
@@ -2118,7 +2163,9 @@ class HostInsights(GenericAPIView):
if res.status_code == 401:
return (dict(error=_('Unauthorized access. Please check your Insights Credential username and password.')), status.HTTP_502_BAD_GATEWAY)
elif res.status_code != 200:
return (dict(error=_('Failed to gather reports and maintenance plans from Insights API at URL {}. Server responded with {} status code and message {}').format(url, res.status_code, res.content)), status.HTTP_502_BAD_GATEWAY)
return (dict(error=_(
'Failed to gather reports and maintenance plans from Insights API at URL {}. Server responded with {} status code and message {}'
).format(url, res.status_code, res.content)), status.HTTP_502_BAD_GATEWAY)
try:
filtered_insights_content = filter_insights_api_response(res.json())
@@ -2366,80 +2413,23 @@ class InventoryScriptView(RetrieveAPIView):
model = Inventory
serializer_class = InventoryScriptSerializer
authentication_classes = [TaskAuthentication] + api_settings.DEFAULT_AUTHENTICATION_CLASSES
permission_classes = (TaskPermission,)
filter_backends = ()
def retrieve(self, request, *args, **kwargs):
obj = self.get_object()
hostname = request.query_params.get('host', '')
hostvars = bool(request.query_params.get('hostvars', ''))
show_all = bool(request.query_params.get('all', ''))
if show_all:
hosts_q = dict()
else:
hosts_q = dict(enabled=True)
if hostname:
host = get_object_or_404(obj.hosts, name=hostname, **hosts_q)
data = host.variables_dict
else:
data = dict()
if obj.variables_dict:
all_group = data.setdefault('all', dict())
all_group['vars'] = obj.variables_dict
if obj.kind == 'smart':
if len(obj.hosts.all()) == 0:
return Response({})
else:
all_group = data.setdefault('all', dict())
smart_hosts_qs = obj.hosts.all()
smart_hosts = list(smart_hosts_qs.values_list('name', flat=True))
all_group['hosts'] = smart_hosts
else:
# Add hosts without a group to the all group.
groupless_hosts_qs = obj.hosts.filter(groups__isnull=True, **hosts_q)
groupless_hosts = list(groupless_hosts_qs.values_list('name', flat=True))
if groupless_hosts:
all_group = data.setdefault('all', dict())
all_group['hosts'] = groupless_hosts
# Build in-memory mapping of groups and their hosts.
group_hosts_kw = dict(group__inventory_id=obj.id, host__inventory_id=obj.id)
if 'enabled' in hosts_q:
group_hosts_kw['host__enabled'] = hosts_q['enabled']
group_hosts_qs = Group.hosts.through.objects.filter(**group_hosts_kw)
group_hosts_qs = group_hosts_qs.values_list('group_id', 'host_id', 'host__name')
group_hosts_map = {}
for group_id, host_id, host_name in group_hosts_qs:
group_hostnames = group_hosts_map.setdefault(group_id, [])
group_hostnames.append(host_name)
# Build in-memory mapping of groups and their children.
group_parents_qs = Group.parents.through.objects.filter(
from_group__inventory_id=obj.id,
to_group__inventory_id=obj.id,
)
group_parents_qs = group_parents_qs.values_list('from_group_id', 'from_group__name', 'to_group_id')
group_children_map = {}
for from_group_id, from_group_name, to_group_id in group_parents_qs:
group_children = group_children_map.setdefault(to_group_id, [])
group_children.append(from_group_name)
# Now use in-memory maps to build up group info.
for group in obj.groups.all():
group_info = dict()
group_info['hosts'] = group_hosts_map.get(group.id, [])
group_info['children'] = group_children_map.get(group.id, [])
group_info['vars'] = group.variables_dict
data[group.name] = group_info
if hostvars:
data.setdefault('_meta', dict())
data['_meta'].setdefault('hostvars', dict())
for host in obj.hosts.filter(**hosts_q):
data['_meta']['hostvars'][host.name] = host.variables_dict
return Response(data)
hosts_q = dict(name=hostname)
if not show_all:
hosts_q['enabled'] = True
host = get_object_or_404(obj.hosts, **hosts_q)
return Response(host.variables_dict)
return Response(obj.get_script_data(
hostvars=bool(request.query_params.get('hostvars', '')),
show_all=show_all
))
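The `get_script_data` call above produces the JSON document Ansible expects from a dynamic inventory script. A hand-built sketch of that shape (group names, host names, and vars are illustrative, not from the source):

```python
import json

# Illustrative inventory payload: an 'all' group with inventory-level vars,
# one named group, and the optional _meta.hostvars block that is included
# when the ?hostvars= query parameter is set.
data = {
    'all': {'vars': {'env': 'prod'}},
    'webservers': {
        'hosts': ['web1', 'web2'],
        'children': [],
        'vars': {'http_port': 80},
    },
    '_meta': {'hostvars': {'web1': {'ansible_host': '10.0.0.5'}}},
}

print(json.dumps(data, indent=2))
```

When a single `?host=` is requested, the view instead returns just that host's variables dict, matching the `--host` contract of inventory scripts.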
class InventoryTreeView(RetrieveAPIView):
@@ -2492,9 +2482,9 @@ class InventoryInventorySourcesUpdate(RetrieveAPIView):
view_name = _('Inventory Sources Update')
model = Inventory
obj_permission_type = 'start'
serializer_class = InventorySourceUpdateSerializer
permission_classes = (InventoryInventorySourcesUpdatePermission,)
is_job_start = True
new_in_320 = True
def retrieve(self, request, *args, **kwargs):
@@ -2512,10 +2502,14 @@ class InventoryInventorySourcesUpdate(RetrieveAPIView):
successes = 0
failures = 0
for inventory_source in inventory.inventory_sources.exclude(source=''):
details = {'inventory_source': inventory_source.pk, 'status': None}
details = OrderedDict()
details['inventory_source'] = inventory_source.pk
details['status'] = None
if inventory_source.can_update:
update = inventory_source.update()
details.update(InventoryUpdateSerializer(update, context=self.get_serializer_context()).to_representation(update))
details['status'] = 'started'
details['inventory_update'] = inventory_source.update().id
details['inventory_update'] = update.id
successes += 1
else:
if not details.get('status'):
@@ -2644,8 +2638,8 @@ class InventorySourceUpdatesList(SubListAPIView):
class InventorySourceUpdateView(RetrieveAPIView):
model = InventorySource
obj_permission_type = 'start'
serializer_class = InventorySourceUpdateSerializer
is_job_start = True
new_in_14 = True
def post(self, request, *args, **kwargs):
@@ -2656,8 +2650,10 @@ class InventorySourceUpdateView(RetrieveAPIView):
return Response({}, status=status.HTTP_400_BAD_REQUEST)
else:
headers = {'Location': update.get_absolute_url(request=request)}
return Response(dict(inventory_update=update.id),
status=status.HTTP_202_ACCEPTED, headers=headers)
data = OrderedDict()
data['inventory_update'] = update.id
data.update(InventoryUpdateSerializer(update, context=self.get_serializer_context()).to_representation(update))
return Response(data, status=status.HTTP_202_ACCEPTED, headers=headers)
else:
return self.http_method_not_allowed(request, *args, **kwargs)
@@ -2667,12 +2663,6 @@ class InventoryUpdateList(ListAPIView):
model = InventoryUpdate
serializer_class = InventoryUpdateListSerializer
def get_queryset(self):
qs = super(InventoryUpdateList, self).get_queryset()
# TODO: remove this defer in 3.3 when we implement https://github.com/ansible/ansible-tower/issues/5436
qs = qs.defer('result_stdout_text')
return qs
class InventoryUpdateDetail(UnifiedJobDeletionMixin, RetrieveDestroyAPIView):
@@ -2684,8 +2674,8 @@ class InventoryUpdateDetail(UnifiedJobDeletionMixin, RetrieveDestroyAPIView):
class InventoryUpdateCancel(RetrieveAPIView):
model = InventoryUpdate
obj_permission_type = 'cancel'
serializer_class = InventoryUpdateCancelSerializer
is_job_cancel = True
new_in_14 = True
def post(self, request, *args, **kwargs):
@@ -2714,7 +2704,7 @@ class JobTemplateList(ListCreateAPIView):
always_allow_superuser = False
capabilities_prefetch = [
'admin', 'execute',
{'copy': ['project.use', 'inventory.use', 'credential.use', 'vault_credential.use']}
{'copy': ['project.use', 'inventory.use']}
]
def post(self, request, *args, **kwargs):
@@ -2733,12 +2723,12 @@ class JobTemplateDetail(RetrieveUpdateDestroyAPIView):
always_allow_superuser = False
class JobTemplateLaunch(RetrieveAPIView, GenericAPIView):
class JobTemplateLaunch(RetrieveAPIView):
model = JobTemplate
obj_permission_type = 'start'
metadata_class = JobTypeMetadata
serializer_class = JobLaunchSerializer
is_job_start = True
always_allow_superuser = False
def update_raw_data(self, data):
@@ -2748,68 +2738,124 @@ class JobTemplateLaunch(RetrieveAPIView, GenericAPIView):
return data
extra_vars = data.pop('extra_vars', None) or {}
if obj:
for p in obj.passwords_needed_to_start:
data[p] = u''
needed_passwords = obj.passwords_needed_to_start
if needed_passwords:
data['credential_passwords'] = {}
for p in needed_passwords:
data['credential_passwords'][p] = u''
else:
data.pop('credential_passwords')
for v in obj.variables_needed_to_start:
extra_vars.setdefault(v, u'')
if extra_vars:
data['extra_vars'] = extra_vars
ask_for_vars_dict = obj._ask_for_vars_dict()
ask_for_vars_dict.pop('extra_vars')
if get_request_version(self.request) == 1: # TODO: remove in 3.3
ask_for_vars_dict.pop('extra_credentials')
for field in ask_for_vars_dict:
if not ask_for_vars_dict[field]:
modified_ask_mapping = JobTemplate.get_ask_mapping()
modified_ask_mapping.pop('extra_vars')
for field, ask_field_name in modified_ask_mapping.items():
if not getattr(obj, ask_field_name):
data.pop(field, None)
elif field == 'inventory' or field == 'credential':
elif field == 'inventory':
data[field] = getattrd(obj, "%s.%s" % (field, 'id'), None)
elif field == 'extra_credentials':
data[field] = [cred.id for cred in obj.extra_credentials.all()]
elif field == 'credentials':
data[field] = [cred.id for cred in obj.credentials.all()]
else:
data[field] = getattr(obj, field)
return data
def post(self, request, *args, **kwargs):
obj = self.get_object()
def modernize_launch_payload(self, data, obj):
'''
Perform simple translations of request data to support the
old field structure of the launch endpoint.
TODO: delete this method with future API version changes
'''
ignored_fields = {}
modern_data = data.copy()
for fd in ('credential', 'vault_credential', 'inventory'):
id_fd = '{}_id'.format(fd)
if fd not in request.data and id_fd in request.data:
request.data[fd] = request.data[id_fd]
if fd not in modern_data and id_fd in modern_data:
modern_data[fd] = modern_data[id_fd]
if get_request_version(self.request) == 1 and 'extra_credentials' in request.data: # TODO: remove in 3.3
if hasattr(request.data, '_mutable') and not request.data._mutable:
request.data._mutable = True
extra_creds = request.data.pop('extra_credentials', None)
# This block causes `extra_credentials` to _always_ be ignored for
# the launch endpoint if we're accessing `/api/v1/`
if get_request_version(self.request) == 1 and 'extra_credentials' in modern_data:
extra_creds = modern_data.pop('extra_credentials', None)
if extra_creds is not None:
ignored_fields['extra_credentials'] = extra_creds
passwords = {}
serializer = self.serializer_class(instance=obj, data=request.data, context={'obj': obj, 'data': request.data, 'passwords': passwords})
# Automatically convert legacy launch credential arguments into a list of `.credentials`
if 'credentials' in modern_data and (
'credential' in modern_data or
'vault_credential' in modern_data or
'extra_credentials' in modern_data
):
raise ParseError({"error": _(
"'credentials' cannot be used in combination with 'credential', 'vault_credential', or 'extra_credentials'."
)})
if (
'credential' in modern_data or
'vault_credential' in modern_data or
'extra_credentials' in modern_data
):
# make a list of the current credentials
existing_credentials = obj.credentials.all()
new_credentials = []
for key, conditional in (
('credential', lambda cred: cred.credential_type.kind != 'ssh'),
('vault_credential', lambda cred: cred.credential_type.kind != 'vault'),
('extra_credentials', lambda cred: cred.credential_type.kind not in ('cloud', 'net'))
):
if key in modern_data:
# if a specific deprecated key is specified, remove all
# credentials of _that_ type from the list of current
# credentials
existing_credentials = filter(conditional, existing_credentials)
prompted_value = modern_data.pop(key)
# add the deprecated credential specified in the request
if not isinstance(prompted_value, Iterable) or isinstance(prompted_value, basestring):
prompted_value = [prompted_value]
# If user gave extra_credentials, special case to use exactly
# the given list without merging with JT credentials
if key == 'extra_credentials' and prompted_value:
obj._deprecated_credential_launch = True # signal to not merge credentials
new_credentials.extend(prompted_value)
# combine the list of "new" and the filtered list of "old"
new_credentials.extend([cred.pk for cred in existing_credentials])
if new_credentials:
modern_data['credentials'] = new_credentials
# credential passwords were historically provided as top-level attributes
if 'credential_passwords' not in modern_data:
modern_data['credential_passwords'] = data.copy()
return (modern_data, ignored_fields)
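The `*_id` aliasing step at the top of `modernize_launch_payload` can be exercised in isolation. A stripped-down sketch (field names are from the code above; the payload itself is made up):

```python
def alias_id_fields(payload):
    # Mirror of the loop above: accept legacy `credential_id`-style keys
    # when the bare field name is absent from the request data.
    data = dict(payload)
    for fd in ('credential', 'vault_credential', 'inventory'):
        id_fd = '{}_id'.format(fd)
        if fd not in data and id_fd in data:
            data[fd] = data[id_fd]
    return data

modern = alias_id_fields({'credential_id': 5, 'inventory': 2})
```

Keys already present in their modern form are left untouched, so a client mixing old and new spellings never has a newer value overwritten.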
def post(self, request, *args, **kwargs):
obj = self.get_object()
try:
modern_data, ignored_fields = self.modernize_launch_payload(
data=request.data, obj=obj
)
except ParseError as exc:
return Response(exc.detail, status=status.HTTP_400_BAD_REQUEST)
serializer = self.serializer_class(data=modern_data, context={'template': obj})
if not serializer.is_valid():
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
_accepted_or_ignored = obj._accept_or_ignore_job_kwargs(**request.data)
prompted_fields = _accepted_or_ignored[0]
ignored_fields.update(_accepted_or_ignored[1])
ignored_fields.update(serializer._ignored_fields)
for fd, model in (
('credential', Credential),
('vault_credential', Credential),
('inventory', Inventory)):
if fd in prompted_fields and prompted_fields[fd] != getattrd(obj, '{}.pk'.format(fd), None):
new_res = get_object_or_400(model, pk=get_pk_from_dict(prompted_fields, fd))
use_role = getattr(new_res, 'use_role')
if request.user not in use_role:
raise PermissionDenied()
if not request.user.can_access(JobLaunchConfig, 'add', serializer.validated_data, template=obj):
raise PermissionDenied()
for cred in prompted_fields.get('extra_credentials', []):
new_credential = get_object_or_400(Credential, pk=cred)
if request.user not in new_credential.use_role:
raise PermissionDenied()
new_job = obj.create_unified_job(**prompted_fields)
passwords = serializer.validated_data.pop('credential_passwords', {})
new_job = obj.create_unified_job(**serializer.validated_data)
result = new_job.signal_start(**passwords)
if not result:
@@ -2818,12 +2864,36 @@ class JobTemplateLaunch(RetrieveAPIView, GenericAPIView):
return Response(data, status=status.HTTP_400_BAD_REQUEST)
else:
data = OrderedDict()
data['ignored_fields'] = ignored_fields
data.update(JobSerializer(new_job, context=self.get_serializer_context()).to_representation(new_job))
data['job'] = new_job.id
data['ignored_fields'] = self.sanitize_for_response(ignored_fields)
data.update(JobSerializer(new_job, context=self.get_serializer_context()).to_representation(new_job))
return Response(data, status=status.HTTP_201_CREATED)
def sanitize_for_response(self, data):
'''
Model objects cannot be serialized by DRF;
this replaces objects with their ids for inclusion in the response.
'''
def display_value(val):
if hasattr(val, 'id'):
return val.id
else:
return val
sanitized_data = {}
for field_name, value in data.items():
if isinstance(value, (set, list)):
sanitized_data[field_name] = []
for sub_value in value:
sanitized_data[field_name].append(display_value(sub_value))
else:
sanitized_data[field_name] = display_value(value)
return sanitized_data
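The id-substitution above is self-contained and can be run with a stand-in object. A sketch; the `Stub` class is hypothetical and merely mimics a model instance carrying a primary key:

```python
class Stub:
    # Hypothetical stand-in for a model instance with a primary key.
    def __init__(self, id):
        self.id = id

def sanitize_for_response(data):
    # Replace any object exposing an `id` attribute with that id;
    # lists and sets are walked element by element.
    def display_value(val):
        return val.id if hasattr(val, 'id') else val

    sanitized = {}
    for field_name, value in data.items():
        if isinstance(value, (set, list)):
            sanitized[field_name] = [display_value(v) for v in value]
        else:
            sanitized[field_name] = display_value(value)
    return sanitized

result = sanitize_for_response({'credentials': [Stub(3), Stub(7)], 'limit': 'web'})
```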
class JobTemplateSchedulesList(SubListCreateAPIView):
view_name = _("Job Template Schedules")
@@ -2839,6 +2909,7 @@ class JobTemplateSchedulesList(SubListCreateAPIView):
class JobTemplateSurveySpec(GenericAPIView):
model = JobTemplate
obj_permission_type = 'admin'
serializer_class = EmptySerializer
new_in_210 = True
@@ -2925,7 +2996,6 @@ class JobTemplateSurveySpec(GenericAPIView):
class WorkflowJobTemplateSurveySpec(WorkflowsEnforcementMixin, JobTemplateSurveySpec):
model = WorkflowJobTemplate
parent_model = WorkflowJobTemplate
new_in_310 = True
@@ -2965,17 +3035,17 @@ class JobTemplateNotificationTemplatesSuccessList(SubListCreateAttachDetachAPIVi
new_in_300 = True
class JobTemplateExtraCredentialsList(SubListCreateAttachDetachAPIView):
class JobTemplateCredentialsList(SubListCreateAttachDetachAPIView):
model = Credential
serializer_class = CredentialSerializer
parent_model = JobTemplate
relationship = 'extra_credentials'
new_in_320 = True
relationship = 'credentials'
new_in_330 = True
new_in_api_v2 = True
def get_queryset(self):
# Return the full list of extra_credentials
# Return the full list of credentials
parent = self.get_parent_object()
self.check_parent_access(parent)
sublist_qs = getattrd(parent, self.relationship)
@@ -2986,15 +3056,29 @@ class JobTemplateExtraCredentialsList(SubListCreateAttachDetachAPIView):
return sublist_qs
def is_valid_relation(self, parent, sub, created=False):
current_extra_types = [
cred.credential_type.pk for cred in parent.extra_credentials.all()
]
if sub.credential_type.pk in current_extra_types:
return {'error': _('Cannot assign multiple %s credentials.' % sub.credential_type.name)}
if sub.unique_hash() in [cred.unique_hash() for cred in parent.credentials.all()]:
return {"error": _("Cannot assign multiple {credential_type} credentials.".format(
credential_type=sub.unique_hash(display=True)))}
if sub.credential_type.kind not in ('net', 'cloud'):
return super(JobTemplateCredentialsList, self).is_valid_relation(parent, sub, created)
class JobTemplateExtraCredentialsList(JobTemplateCredentialsList):
deprecated = True
new_in_320 = True
new_in_330 = False
def get_queryset(self):
sublist_qs = super(JobTemplateExtraCredentialsList, self).get_queryset()
sublist_qs = sublist_qs.filter(credential_type__kind__in=['cloud', 'net'])
return sublist_qs
def is_valid_relation(self, parent, sub, created=False):
valid = super(JobTemplateExtraCredentialsList, self).is_valid_relation(parent, sub, created)
if sub.credential_type.kind not in ('cloud', 'net'):
return {'error': _('Extra credentials must be network or cloud.')}
return super(JobTemplateExtraCredentialsList, self).is_valid_relation(parent, sub, created)
return valid
class JobTemplateLabelList(DeleteLastUnattachLabelMixin, SubListCreateAttachDetachAPIView):
@@ -3158,7 +3242,8 @@ class JobTemplateCallback(GenericAPIView):
# Everything is fine; actually create the job.
kv = {"limit": limit, "launch_type": 'callback'}
if extra_vars is not None and job_template.ask_variables_on_launch:
kv['extra_vars'] = callback_filter_out_ansible_extra_vars(extra_vars)
extra_vars_redacted, removed = extract_ansible_vars(extra_vars)
kv['extra_vars'] = extra_vars_redacted
with transaction.atomic():
job = job_template.create_job(**kv)
@@ -3225,10 +3310,20 @@ class WorkflowJobNodeDetail(WorkflowsEnforcementMixin, RetrieveAPIView):
new_in_310 = True
class WorkflowJobNodeCredentialsList(SubListAPIView):
model = Credential
serializer_class = CredentialSerializer
parent_model = WorkflowJobNode
relationship = 'credentials'
new_in_330 = True
new_in_api_v2 = True
class WorkflowJobTemplateNodeList(WorkflowsEnforcementMixin, ListCreateAPIView):
model = WorkflowJobTemplateNode
serializer_class = WorkflowJobTemplateNodeListSerializer
serializer_class = WorkflowJobTemplateNodeSerializer
new_in_310 = True
@@ -3238,21 +3333,18 @@ class WorkflowJobTemplateNodeDetail(WorkflowsEnforcementMixin, RetrieveUpdateDes
serializer_class = WorkflowJobTemplateNodeDetailSerializer
new_in_310 = True
def update_raw_data(self, data):
for fd in ['job_type', 'job_tags', 'skip_tags', 'limit', 'skip_tags']:
data[fd] = None
try:
obj = self.get_object()
data.update(obj.char_prompts)
except:
pass
return super(WorkflowJobTemplateNodeDetail, self).update_raw_data(data)
class WorkflowJobTemplateNodeCredentialsList(LaunchConfigCredentialsBase):
parent_model = WorkflowJobTemplateNode
new_in_330 = True
new_in_api_v2 = True
class WorkflowJobTemplateNodeChildrenBaseList(WorkflowsEnforcementMixin, EnforceParentRelationshipMixin, SubListCreateAttachDetachAPIView):
model = WorkflowJobTemplateNode
serializer_class = WorkflowJobTemplateNodeListSerializer
serializer_class = WorkflowJobTemplateNodeSerializer
always_allow_superuser = True
parent_model = WorkflowJobTemplateNode
relationship = ''
@@ -3409,9 +3501,9 @@ class WorkflowJobTemplateLaunch(WorkflowsEnforcementMixin, RetrieveAPIView):
model = WorkflowJobTemplate
obj_permission_type = 'start'
serializer_class = WorkflowJobLaunchSerializer
new_in_310 = True
is_job_start = True
always_allow_superuser = False
def update_raw_data(self, data):
@@ -3429,30 +3521,28 @@ class WorkflowJobTemplateLaunch(WorkflowsEnforcementMixin, RetrieveAPIView):
def post(self, request, *args, **kwargs):
obj = self.get_object()
if not request.user.can_access(self.model, 'start', obj):
raise PermissionDenied()
serializer = self.serializer_class(instance=obj, data=request.data)
if not serializer.is_valid():
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
prompted_fields, ignored_fields = obj._accept_or_ignore_job_kwargs(**request.data)
prompted_fields, ignored_fields, errors = obj._accept_or_ignore_job_kwargs(**request.data)
new_job = obj.create_unified_job(**prompted_fields)
new_job.signal_start()
data = OrderedDict()
data['workflow_job'] = new_job.id
data['ignored_fields'] = ignored_fields
data.update(WorkflowJobSerializer(new_job, context=self.get_serializer_context()).to_representation(new_job))
data['workflow_job'] = new_job.id
return Response(data, status=status.HTTP_201_CREATED)
class WorkflowJobRelaunch(WorkflowsEnforcementMixin, GenericAPIView):
model = WorkflowJob
obj_permission_type = 'start'
serializer_class = EmptySerializer
is_job_start = True
new_in_310 = True
def check_object_permissions(self, request, obj):
@@ -3478,17 +3568,12 @@ class WorkflowJobRelaunch(WorkflowsEnforcementMixin, GenericAPIView):
class WorkflowJobTemplateWorkflowNodesList(WorkflowsEnforcementMixin, SubListCreateAPIView):
model = WorkflowJobTemplateNode
serializer_class = WorkflowJobTemplateNodeListSerializer
serializer_class = WorkflowJobTemplateNodeSerializer
parent_model = WorkflowJobTemplate
relationship = 'workflow_job_template_nodes'
parent_key = 'workflow_job_template'
new_in_310 = True
def update_raw_data(self, data):
for fd in ['job_type', 'job_tags', 'skip_tags', 'limit', 'skip_tags']:
data[fd] = None
return super(WorkflowJobTemplateWorkflowNodesList, self).update_raw_data(data)
def get_queryset(self):
return super(WorkflowJobTemplateWorkflowNodesList, self).get_queryset().order_by('id')
@@ -3609,8 +3694,8 @@ class WorkflowJobWorkflowNodesList(WorkflowsEnforcementMixin, SubListAPIView):
class WorkflowJobCancel(WorkflowsEnforcementMixin, RetrieveAPIView):
model = WorkflowJob
obj_permission_type = 'cancel'
serializer_class = WorkflowJobCancelSerializer
is_job_cancel = True
new_in_310 = True
def post(self, request, *args, **kwargs):
@@ -3664,8 +3749,8 @@ class SystemJobTemplateDetail(RetrieveAPIView):
class SystemJobTemplateLaunch(GenericAPIView):
model = SystemJobTemplate
obj_permission_type = 'start'
serializer_class = EmptySerializer
is_job_start = True
new_in_210 = True
def get(self, request, *args, **kwargs):
@@ -3676,7 +3761,9 @@ class SystemJobTemplateLaunch(GenericAPIView):
new_job = obj.create_unified_job(extra_vars=request.data.get('extra_vars', {}))
new_job.signal_start()
data = dict(system_job=new_job.id)
data = OrderedDict()
data['system_job'] = new_job.id
data.update(SystemJobSerializer(new_job, context=self.get_serializer_context()).to_representation(new_job))
return Response(data, status=status.HTTP_201_CREATED)
@@ -3764,14 +3851,26 @@ class JobDetail(UnifiedJobDeletionMixin, RetrieveUpdateDestroyAPIView):
return super(JobDetail, self).update(request, *args, **kwargs)
class JobExtraCredentialsList(SubListAPIView):
class JobCredentialsList(SubListAPIView):
model = Credential
serializer_class = CredentialSerializer
parent_model = Job
relationship = 'extra_credentials'
new_in_320 = True
relationship = 'credentials'
new_in_api_v2 = True
new_in_330 = True
class JobExtraCredentialsList(JobCredentialsList):
deprecated = True
new_in_320 = True
new_in_330 = False
def get_queryset(self):
sublist_qs = super(JobExtraCredentialsList, self).get_queryset()
sublist_qs = sublist_qs.filter(credential_type__kind__in=['cloud', 'net'])
return sublist_qs
class JobLabelList(SubListAPIView):
@@ -3802,8 +3901,8 @@ class JobActivityStreamList(ActivityStreamEnforcementMixin, SubListAPIView):
class JobStart(GenericAPIView):
model = Job
obj_permission_type = 'start'
serializer_class = EmptySerializer
is_job_start = True
deprecated = True
def v2_not_allowed(self):
@@ -3840,8 +3939,8 @@ class JobStart(GenericAPIView):
class JobCancel(RetrieveAPIView):
model = Job
obj_permission_type = 'cancel'
serializer_class = JobCancelSerializer
is_job_cancel = True
def post(self, request, *args, **kwargs):
obj = self.get_object()
@@ -3852,11 +3951,11 @@ class JobCancel(RetrieveAPIView):
return self.http_method_not_allowed(request, *args, **kwargs)
class JobRelaunch(RetrieveAPIView, GenericAPIView):
class JobRelaunch(RetrieveAPIView):
model = Job
obj_permission_type = 'start'
serializer_class = JobRelaunchSerializer
is_job_start = True
@csrf_exempt
@transaction.non_atomic_requests
@@ -3879,7 +3978,26 @@ class JobRelaunch(RetrieveAPIView, GenericAPIView):
if not serializer.is_valid():
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
new_job = obj.copy_unified_job()
copy_kwargs = {}
retry_hosts = request.data.get('hosts', None)
if retry_hosts and retry_hosts != 'all':
if obj.status in ACTIVE_STATES:
return Response({'hosts': _(
'Wait until job finishes before retrying on {status_value} hosts.'
).format(status_value=retry_hosts)}, status=status.HTTP_400_BAD_REQUEST)
host_qs = obj.retry_qs(retry_hosts)
if not obj.job_events.filter(event='playbook_on_stats').exists():
return Response({'hosts': _(
'Cannot retry on {status_value} hosts, playbook stats not available.'
).format(status_value=retry_hosts)}, status=status.HTTP_400_BAD_REQUEST)
retry_host_list = host_qs.values_list('name', flat=True)
if len(retry_host_list) == 0:
return Response({'hosts': _(
'Cannot relaunch because previous job had 0 {status_value} hosts.'
).format(status_value=retry_hosts)}, status=status.HTTP_400_BAD_REQUEST)
copy_kwargs['limit'] = ','.join(retry_host_list)
new_job = obj.copy_unified_job(**copy_kwargs)
result = new_job.signal_start(**request.data)
if not result:
data = dict(passwords_needed_to_start=new_job.passwords_needed_to_start)
@@ -3892,6 +4010,52 @@ class JobRelaunch(RetrieveAPIView, GenericAPIView):
return Response(data, status=status.HTTP_201_CREATED, headers=headers)
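The host-retry branch above boils down to joining the matching host names into an Ansible-style `limit` string, or rejecting the relaunch when no hosts match. A minimal sketch of just that logic — `failed_hosts` is a hypothetical stand-in for the `obj.retry_qs(retry_hosts).values_list('name', flat=True)` queryset:

```python
def build_retry_limit(host_names):
    """Join host names into a comma-separated Ansible --limit value.

    Returns None when there are no hosts to retry, mirroring the
    view's 400 response for an empty queryset.
    """
    names = list(host_names)
    if not names:
        return None
    return ','.join(names)

# Hypothetical host names standing in for the retry queryset.
failed_hosts = ['web01', 'web02', 'db01']
limit = build_retry_limit(failed_hosts)
```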
class JobCreateSchedule(RetrieveAPIView):
model = Job
obj_permission_type = 'start'
serializer_class = JobCreateScheduleSerializer
new_in_330 = True
def post(self, request, *args, **kwargs):
obj = self.get_object()
if not obj.can_schedule:
return Response({"error": _('Information needed to schedule this job is missing.')},
status=status.HTTP_400_BAD_REQUEST)
config = obj.launch_config
if not request.user.can_access(JobLaunchConfig, 'add', {'reference_obj': obj}):
raise PermissionDenied()
# Make up a name for the schedule, guarantee that it is unique
name = 'Auto-generated schedule from job {}'.format(obj.id)
existing_names = Schedule.objects.filter(name__startswith=name).values_list('name', flat=True)
if name in existing_names:
idx = 1
alt_name = '{} - number {}'.format(name, idx)
while alt_name in existing_names:
idx += 1
alt_name = '{} - number {}'.format(name, idx)
name = alt_name
schedule = Schedule.objects.create(
name=name,
unified_job_template=obj.unified_job_template,
enabled=False,
rrule='{}Z RRULE:FREQ=MONTHLY;INTERVAL=1'.format(now().strftime('DTSTART:%Y%m%dT%H%M%S')),
extra_data=config.extra_data,
survey_passwords=config.survey_passwords,
inventory=config.inventory,
char_prompts=config.char_prompts
)
schedule.credentials.add(*config.credentials.all())
data = ScheduleSerializer(schedule, context=self.get_serializer_context()).data
headers = {'Location': schedule.get_absolute_url(request=request)}
return Response(data, status=status.HTTP_201_CREATED, headers=headers)
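The name-collision loop in `JobCreateSchedule.post()` can be isolated as a small helper. A sketch under the assumption that `existing_names` is any container supporting membership tests (in the view it is a `values_list` queryset):

```python
def unique_schedule_name(base_name, existing_names):
    """Append ' - number N' with increasing N until the name is free,
    mirroring the loop in JobCreateSchedule.post()."""
    if base_name not in existing_names:
        return base_name
    idx = 1
    alt_name = '{} - number {}'.format(base_name, idx)
    while alt_name in existing_names:
        idx += 1
        alt_name = '{} - number {}'.format(base_name, idx)
    return alt_name

name = 'Auto-generated schedule from job 42'
taken = {name, name + ' - number 1'}
result = unique_schedule_name(name, taken)
```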
class JobNotificationsList(SubListAPIView):
model = Notification
@@ -4096,8 +4260,8 @@ class AdHocCommandDetail(UnifiedJobDeletionMixin, RetrieveDestroyAPIView):
class AdHocCommandCancel(RetrieveAPIView):
model = AdHocCommand
obj_permission_type = 'cancel'
serializer_class = AdHocCommandCancelSerializer
is_job_cancel = True
new_in_220 = True
def post(self, request, *args, **kwargs):
@@ -4112,8 +4276,8 @@ class AdHocCommandCancel(RetrieveAPIView):
class AdHocCommandRelaunch(GenericAPIView):
model = AdHocCommand
obj_permission_type = 'start'
serializer_class = AdHocCommandRelaunchSerializer
is_job_start = True
new_in_220 = True
# FIXME: Figure out why OPTIONS request still shows all fields.
@@ -4247,8 +4411,8 @@ class SystemJobDetail(UnifiedJobDeletionMixin, RetrieveDestroyAPIView):
class SystemJobCancel(RetrieveAPIView):
model = SystemJob
obj_permission_type = 'cancel'
serializer_class = SystemJobCancelSerializer
is_job_cancel = True
new_in_210 = True
def post(self, request, *args, **kwargs):
@@ -4277,7 +4441,6 @@ class UnifiedJobTemplateList(ListAPIView):
capabilities_prefetch = [
'admin', 'execute',
{'copy': ['jobtemplate.project.use', 'jobtemplate.inventory.use',
'jobtemplate.credential.use', 'jobtemplate.vault_credential.use',
'workflowjobtemplate.organization.admin']}
]
@@ -4469,9 +4632,9 @@ class NotificationTemplateTest(GenericAPIView):
view_name = _('Notification Template Test')
model = NotificationTemplate
obj_permission_type = 'start'
serializer_class = EmptySerializer
new_in_300 = True
is_job_start = True
def post(self, request, *args, **kwargs):
obj = self.get_object()
@@ -4481,8 +4644,11 @@ class NotificationTemplateTest(GenericAPIView):
return Response({}, status=status.HTTP_400_BAD_REQUEST)
else:
send_notifications.delay([notification.id])
data = OrderedDict()
data['notification'] = notification.id
data.update(NotificationSerializer(notification, context=self.get_serializer_context()).to_representation(notification))
headers = {'Location': notification.get_absolute_url(request=request)}
return Response({"notification": notification.id},
return Response(data,
headers=headers,
status=status.HTTP_202_ACCEPTED)

awx/celery.py (new file, 23 lines added)

@@ -0,0 +1,23 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
try:
import awx.devonly # noqa
MODE = 'development'
except ImportError: # pragma: no cover
MODE = 'production'
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'awx.settings.%s' % MODE)
app = Celery('awx')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
if __name__ == '__main__':
app.start()


@@ -53,6 +53,47 @@ class StringListField(ListField):
return super(StringListField, self).to_representation(value)
class StringListBooleanField(ListField):
default_error_messages = {
'type_error': _('Expected None, True, False, a string or list of strings but got {input_type} instead.'),
}
child = CharField()
def to_representation(self, value):
try:
if isinstance(value, (list, tuple)):
return super(StringListBooleanField, self).to_representation(value)
elif value in NullBooleanField.TRUE_VALUES:
return True
elif value in NullBooleanField.FALSE_VALUES:
return False
elif value in NullBooleanField.NULL_VALUES:
return None
elif isinstance(value, basestring):
return self.child.to_representation(value)
except TypeError:
pass
self.fail('type_error', input_type=type(value))
def to_internal_value(self, data):
try:
if isinstance(data, (list, tuple)):
return super(StringListBooleanField, self).to_internal_value(data)
elif data in NullBooleanField.TRUE_VALUES:
return True
elif data in NullBooleanField.FALSE_VALUES:
return False
elif data in NullBooleanField.NULL_VALUES:
return None
elif isinstance(data, basestring):
return self.child.run_validation(data)
except TypeError:
pass
self.fail('type_error', input_type=type(data))
class URLField(CharField):
def __init__(self, **kwargs):
@@ -83,7 +124,7 @@ class URLField(CharField):
else:
netloc = '{}@{}'.format(url_parts.username, netloc)
value = urlparse.urlunsplit([url_parts.scheme, netloc, url_parts.path, url_parts.query, url_parts.fragment])
except:
except Exception:
raise # If something fails here, just fall through and let the validators check it.
super(URLField, self).run_validators(value)
@@ -100,3 +141,25 @@ class KeyValueField(DictField):
if not isinstance(value, six.string_types + six.integer_types + (float,)):
self.fail('invalid_child', input=value)
return ret
class ListTuplesField(ListField):
default_error_messages = {
'type_error': _('Expected a list of tuples of max length 2 but got {input_type} instead.'),
}
def to_representation(self, value):
if isinstance(value, (list, tuple)):
return super(ListTuplesField, self).to_representation(value)
else:
self.fail('type_error', input_type=type(value))
def to_internal_value(self, data):
if isinstance(data, list):
for x in data:
if not isinstance(x, (list, tuple)) or len(x) > 2:
self.fail('type_error', input_type=type(x))
return super(ListTuplesField, self).to_internal_value(data)
else:
self.fail('type_error', input_type=type(data))


@@ -159,14 +159,14 @@ class SettingsRegistry(object):
if category_slug == 'user' and for_user:
try:
field_instance.default = original_field_instance.to_representation(getattr(self.settings, setting))
except:
except Exception:
logger.warning('Unable to retrieve default value for user setting "%s".', setting, exc_info=True)
elif not field_instance.read_only or field_instance.default is empty or field_instance.defined_in_file:
try:
field_instance.default = original_field_instance.to_representation(self.settings._awx_conf_settings._get_default(setting))
except AttributeError:
pass
except:
except Exception:
logger.warning('Unable to retrieve default value for setting "%s".', setting, exc_info=True)
# `PENDO_TRACKING_STATE` is disabled for the open source awx license


@@ -16,7 +16,7 @@ class SettingSerializer(BaseSerializer):
class Meta:
model = Setting
fields = ('id', 'key', 'value')
readonly_fields = ('id', 'key', 'value')
read_only_fields = ('id', 'key', 'value')
def __init__(self, instance=None, data=serializers.empty, **kwargs):
if instance is None and data is not serializers.empty and 'key' in data:


@@ -9,6 +9,7 @@ import time
import six
# Django
from django.conf import LazySettings
from django.conf import settings, UserSettingsHolder
from django.core.cache import cache as django_cache
from django.core.exceptions import ImproperlyConfigured
@@ -366,7 +367,7 @@ class SettingsWrapper(UserSettingsHolder):
return internal_value
else:
return field.run_validation(value)
except:
except Exception:
logger.warning(
'The current value "%r" for setting "%s" is invalid.',
value, name, exc_info=True)
@@ -458,3 +459,19 @@ class SettingsWrapper(UserSettingsHolder):
set_locally = Setting.objects.filter(key=setting, user__isnull=True).exists()
set_on_default = getattr(self.default_settings, 'is_overridden', lambda s: False)(setting)
return (set_locally or set_on_default)
def __getattr_without_cache__(self, name):
# Django 1.10 added an optimization to settings lookup:
# https://code.djangoproject.com/ticket/27625
# https://github.com/django/django/commit/c1b221a9b913315998a1bcec2f29a9361a74d1ac
# This change caches settings lookups on the __dict__ of the LazySettings
# object, which is not okay to do in an environment where settings can
# change in-process (the entire point of awx's custom settings implementation)
# This restores the original behavior that *does not* cache.
if self._wrapped is empty:
self._setup(name)
return getattr(self._wrapped, name)
LazySettings.__getattr__ = __getattr_without_cache__
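The caching behavior this patch disables can be illustrated without Django: a `__getattr__` that stores the result on `__dict__` (what Django 1.10's `LazySettings` started doing) never sees later backend changes, while one that always delegates (what the patch restores) does. A minimal, self-contained sketch — not the actual `LazySettings` code:

```python
class Backend:
    DEBUG = False

class CachingProxy:
    """Mimics Django 1.10 LazySettings: the first lookup is stored on
    __dict__, so later backend changes become invisible."""
    def __init__(self, backend):
        object.__setattr__(self, '_backend', backend)
    def __getattr__(self, name):
        value = getattr(self._backend, name)
        self.__dict__[name] = value  # cache on the instance
        return value

class UncachedProxy:
    """Mimics the restored behavior: every lookup hits the backend."""
    def __init__(self, backend):
        object.__setattr__(self, '_backend', backend)
    def __getattr__(self, name):
        return getattr(self._backend, name)

backend = Backend()
cached, uncached = CachingProxy(backend), UncachedProxy(backend)
before = (cached.DEBUG, uncached.DEBUG)   # both read False
backend.DEBUG = True                      # setting changes in-process
after = (cached.DEBUG, uncached.DEBUG)    # the caching proxy is stale
```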


@@ -0,0 +1,86 @@
import pytest
from rest_framework.fields import ValidationError
from awx.conf.fields import StringListBooleanField, ListTuplesField
class TestStringListBooleanField():
FIELD_VALUES = [
("hello", "hello"),
(("a", "b"), ["a", "b"]),
(["a", "b", 1, 3.13, "foo", "bar", "foobar"], ["a", "b", "1", "3.13", "foo", "bar", "foobar"]),
("True", True),
("TRUE", True),
("true", True),
(True, True),
("False", False),
("FALSE", False),
("false", False),
(False, False),
("", None),
("null", None),
("NULL", None),
]
FIELD_VALUES_INVALID = [
1.245,
{"a": "b"},
]
@pytest.mark.parametrize("value_in, value_known", FIELD_VALUES)
def test_to_internal_value_valid(self, value_in, value_known):
field = StringListBooleanField()
v = field.to_internal_value(value_in)
assert v == value_known
@pytest.mark.parametrize("value", FIELD_VALUES_INVALID)
def test_to_internal_value_invalid(self, value):
field = StringListBooleanField()
with pytest.raises(ValidationError) as e:
field.to_internal_value(value)
assert e.value.detail[0] == "Expected None, True, False, a string or list " \
"of strings but got {} instead.".format(type(value))
@pytest.mark.parametrize("value_in, value_known", FIELD_VALUES)
def test_to_representation_valid(self, value_in, value_known):
field = StringListBooleanField()
v = field.to_representation(value_in)
assert v == value_known
@pytest.mark.parametrize("value", FIELD_VALUES_INVALID)
def test_to_representation_invalid(self, value):
field = StringListBooleanField()
with pytest.raises(ValidationError) as e:
field.to_representation(value)
assert e.value.detail[0] == "Expected None, True, False, a string or list " \
"of strings but got {} instead.".format(type(value))
class TestListTuplesField():
FIELD_VALUES = [
([('a', 'b'), ('abc', '123')], [("a", "b"), ("abc", "123")]),
]
FIELD_VALUES_INVALID = [
("abc", type("abc")),
([('a', 'b', 'c'), ('abc', '123', '456')], type(('a',))),
(['a', 'b'], type('a')),
(123, type(123)),
]
@pytest.mark.parametrize("value_in, value_known", FIELD_VALUES)
def test_to_internal_value_valid(self, value_in, value_known):
field = ListTuplesField()
v = field.to_internal_value(value_in)
assert v == value_known
@pytest.mark.parametrize("value, t", FIELD_VALUES_INVALID)
def test_to_internal_value_invalid(self, value, t):
field = ListTuplesField()
with pytest.raises(ValidationError) as e:
field.to_internal_value(value)
assert e.value.detail[0] == "Expected a list of tuples of max length 2 " \
"but got {} instead.".format(t)


@@ -1,16 +1,17 @@
# Copyright (c) 2016 Ansible, Inc.
# All Rights Reserved.
# Django
from django.conf.urls import patterns
# Tower
from awx.api.urls import url
urlpatterns = patterns(
'awx.conf.views',
url(r'^$', 'setting_category_list'),
url(r'^(?P<category_slug>[a-z0-9-]+)/$', 'setting_singleton_detail'),
url(r'^logging/test/$', 'setting_logging_test'),
from django.conf.urls import url
from awx.conf.views import (
SettingCategoryList,
SettingSingletonDetail,
SettingLoggingTest,
)
urlpatterns = [
url(r'^$', SettingCategoryList.as_view(), name='setting_category_list'),
url(r'^(?P<category_slug>[a-z0-9-]+)/$', SettingSingletonDetail.as_view(), name='setting_singleton_detail'),
url(r'^logging/test/$', SettingLoggingTest.as_view(), name='setting_logging_test'),
]


@@ -16,7 +16,7 @@ class argv_placeholder(object):
def __del__(self):
try:
argv_ready(sys.argv)
except:
except Exception:
pass


@@ -1,3 +1,8 @@
# Copyright (c) 2017 Ansible by Red Hat
# All Rights Reserved
from __future__ import absolute_import
from collections import OrderedDict
import json
import mock
@@ -23,9 +28,9 @@ with mock.patch.dict(os.environ, {'ANSIBLE_STDOUT_CALLBACK': CALLBACK,
'ANSIBLE_CALLBACK_PLUGINS': PLUGINS}):
from ansible.cli.playbook import PlaybookCLI
from ansible.executor.playbook_executor import PlaybookExecutor
from ansible.inventory import Inventory
from ansible.inventory.manager import InventoryManager
from ansible.parsing.dataloader import DataLoader
from ansible.vars import VariableManager
from ansible.vars.manager import VariableManager
# Add awx/lib to sys.path so we can use the plugin
path = os.path.abspath(os.path.join(PLUGINS, '..', '..'))
@@ -62,9 +67,8 @@ def executor(tmpdir_factory, request):
cli.parse()
options = cli.parser.parse_args(['-v'])[0]
loader = DataLoader()
variable_manager = VariableManager()
inventory = Inventory(loader=loader, variable_manager=variable_manager,
host_list=['localhost'])
variable_manager = VariableManager(loader=loader)
inventory = InventoryManager(loader=loader, sources='localhost,')
variable_manager.set_inventory(inventory)
return PlaybookExecutor(playbooks=playbook_files, inventory=inventory,

File diff suppressed because it is too large.


@@ -70,14 +70,9 @@ register(
label=_('Remote Host Headers'),
help_text=_('HTTP headers and meta keys to search to determine remote host '
'name or IP. Add additional items to this list, such as '
'"HTTP_X_FORWARDED_FOR", if behind a reverse proxy.\n\n'
'Note: The headers will be searched in order and the first '
'found remote host name or IP will be used.\n\n'
'In the below example 8.8.8.7 would be the chosen IP address.\n'
'X-Forwarded-For: 8.8.8.7, 192.168.2.1, 127.0.0.1\n'
'Host: 127.0.0.1\n'
'REMOTE_HOST_HEADERS = [\'HTTP_X_FORWARDED_FOR\', '
'\'REMOTE_ADDR\', \'REMOTE_HOST\']'),
'"HTTP_X_FORWARDED_FOR", if behind a reverse proxy. '
'See the "Proxy Support" section of the Administrator guide '
'for more details.'),
category=_('System'),
category_slug='system',
)
@@ -88,9 +83,7 @@ register(
label=_('Proxy IP Whitelist'),
help_text=_("If Tower is behind a reverse proxy/load balancer, use this setting "
"to whitelist the proxy IP addresses from which Tower should trust "
"custom REMOTE_HOST_HEADERS header values\n"
"REMOTE_HOST_HEADERS = ['HTTP_X_FORWARDED_FOR', ''REMOTE_ADDR', 'REMOTE_HOST']\n"
"PROXY_IP_WHITELIST = ['10.0.1.100', '10.0.1.101']\n"
"custom REMOTE_HOST_HEADERS header values. "
"If this setting is an empty list (the default), the headers specified by "
"REMOTE_HOST_HEADERS will be trusted unconditionally."),
category=_('System'),
@@ -105,7 +98,7 @@ def _load_default_license_from_file():
license_data = json.load(open(license_file))
logger.debug('Read license data from "%s".', license_file)
return license_data
except:
except Exception:
logger.warning('Could not read license from "%s".', license_file, exc_info=True)
return {}
@@ -268,7 +261,8 @@ register(
field_class=fields.IntegerField,
min_value=0,
label=_('Job Event Standard Output Maximum Display Size'),
help_text=_(u'Maximum Size of Standard Output in bytes to display for a single job or ad hoc command event. `stdout` will end with `\u2026` when truncated.'),
help_text=_(
u'Maximum Size of Standard Output in bytes to display for a single job or ad hoc command event. `stdout` will end with `\u2026` when truncated.'),
category=_('Jobs'),
category_slug='jobs',
)


@@ -7,5 +7,7 @@ from django.utils.translation import ugettext_lazy as _
CLOUD_PROVIDERS = ('azure_rm', 'ec2', 'gce', 'vmware', 'openstack', 'satellite6', 'cloudforms')
SCHEDULEABLE_PROVIDERS = CLOUD_PROVIDERS + ('custom', 'scm',)
PRIVILEGE_ESCALATION_METHODS = [ ('sudo', _('Sudo')), ('su', _('Su')), ('pbrun', _('Pbrun')), ('pfexec', _('Pfexec')), ('dzdo', _('DZDO')), ('pmrun', _('Pmrun')), ('runas', _('Runas'))]
PRIVILEGE_ESCALATION_METHODS = [
('sudo', _('Sudo')), ('su', _('Su')), ('pbrun', _('Pbrun')), ('pfexec', _('Pfexec')),
('dzdo', _('DZDO')), ('pmrun', _('Pmrun')), ('runas', _('Runas'))]
ANSI_SGR_PATTERN = re.compile(r'\x1b\[[0-9;]*m')


@@ -24,7 +24,7 @@ def discard_groups(message):
@channel_session
def ws_connect(message):
connect_text = {'accept':False, 'user':None}
message.reply_channel.send({"accept": True})
message.content['method'] = 'FAKE'
request = AsgiRequest(message)
@@ -35,11 +35,12 @@ def ws_connect(message):
auth_token = AuthToken.objects.get(key=token)
if auth_token.in_valid_tokens:
message.channel_session['user_id'] = auth_token.user_id
connect_text['accept'] = True
connect_text['user'] = auth_token.user_id
message.reply_channel.send({"text": json.dumps({"accept": True, "user": auth_token.user_id})})
return None
except AuthToken.DoesNotExist:
logger.error("auth_token provided was invalid.")
message.reply_channel.send({"text": json.dumps(connect_text)})
message.reply_channel.send({"close": True})
return None
@channel_session
@@ -81,7 +82,8 @@ def ws_receive(message):
if access_cls is not None:
user_access = access_cls(user)
if not user_access.get_queryset().filter(pk=oid).exists():
message.reply_channel.send({"text": json.dumps({"error": "access denied to channel {0} for resource id {1}".format(group_name, oid)})})
message.reply_channel.send({"text": json.dumps(
{"error": "access denied to channel {0} for resource id {1}".format(group_name, oid)})})
continue
current_groups.add(name)
Group(name).add(message.reply_channel)


@@ -9,6 +9,7 @@ import stat
import tempfile
import time
import logging
from distutils.version import LooseVersion as Version
from django.conf import settings
@@ -370,7 +371,24 @@ class IsolatedManager(object):
logger.warning('Isolated job {} cleanup error, output:\n{}'.format(self.instance.id, output))
@classmethod
def health_check(cls, instance_qs):
def update_capacity(cls, instance, task_result, awx_application_version):
instance.version = task_result['version']
isolated_version = instance.version.split("-", 1)[0]
cluster_version = awx_application_version.split("-", 1)[0]
if Version(cluster_version) > Version(isolated_version):
err_template = "Isolated instance {} reports version {}, cluster node is at {}, setting capacity to zero."
logger.error(err_template.format(instance.hostname, instance.version, awx_application_version))
instance.capacity = 0
else:
if instance.capacity == 0 and task_result['capacity']:
logger.warning('Isolated instance {} has re-joined.'.format(instance.hostname))
instance.capacity = int(task_result['capacity'])
instance.save(update_fields=['capacity', 'version', 'modified'])
@classmethod
def health_check(cls, instance_qs, awx_application_version):
'''
:param instance_qs: List of Django objects representing the
isolated instances to manage
@@ -412,11 +430,7 @@ class IsolatedManager(object):
except (KeyError, IndexError):
task_result = {}
if 'capacity' in task_result:
instance.version = task_result['version']
if instance.capacity == 0 and task_result['capacity']:
logger.warning('Isolated instance {} has re-joined.'.format(instance.hostname))
instance.capacity = int(task_result['capacity'])
instance.save(update_fields=['capacity', 'version', 'modified'])
cls.update_capacity(instance, task_result, awx_application_version)
elif instance.capacity == 0:
logger.debug('Isolated instance {} previously marked as lost, could not re-join.'.format(
instance.hostname))
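The version gate in `update_capacity` compares only the portion of each version string before the first `-`. A rough sketch of that comparison using a plain tuple parse (the real code uses `distutils.version.LooseVersion`, which tolerates more formats, so this is a simplification; the version strings below are illustrative):

```python
def version_tuple(version):
    """Parse 'X.Y.Z-suffix' into a comparable tuple, keeping only
    the part before the first '-', as update_capacity() does."""
    core = version.split('-', 1)[0]
    return tuple(int(part) for part in core.split('.'))

def effective_capacity(cluster_version, isolated_version, reported_capacity):
    """Zero out capacity when the isolated node lags the cluster node."""
    if version_tuple(cluster_version) > version_tuple(isolated_version):
        return 0
    return int(reported_capacity)

cap_behind = effective_capacity('1.0.2-devel', '1.0.1-devel', 50)
cap_current = effective_capacity('1.0.2-devel', '1.0.2-devel', 50)
```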


@@ -122,7 +122,7 @@ def run_pexpect(args, cwd, env, logfile,
if cancelled_callback:
try:
canceled = cancelled_callback()
except:
except Exception:
logger.exception('Could not check cancel callback - canceling immediately')
if isinstance(extra_update_fields, dict):
extra_update_fields['job_explanation'] = "System error during job execution, check system logs"
@@ -271,12 +271,8 @@ def __run__(private_data_dir):
if __name__ == '__main__':
__version__ = '3.2.0'
try:
import awx
__version__ = awx.__version__
except ImportError:
pass # in devel, `awx` isn't an installed package
import awx
__version__ = awx.__version__
parser = argparse.ArgumentParser(description='manage a daemonized, isolated ansible playbook')
parser.add_argument('--version', action='version', version=__version__ + '-isolated')
parser.add_argument('command', choices=['start', 'stop', 'is-alive'])


@@ -18,12 +18,12 @@ from django.db.models.signals import (
)
from django.db.models.signals import m2m_changed
from django.db import models
from django.db.models.fields.related import (
add_lazy_relation,
SingleRelatedObjectDescriptor,
ReverseSingleRelatedObjectDescriptor,
ManyRelatedObjectsDescriptor,
ReverseManyRelatedObjectsDescriptor,
from django.db.models.fields.related import add_lazy_relation
from django.db.models.fields.related_descriptors import (
ReverseOneToOneDescriptor,
ForwardManyToOneDescriptor,
ManyToManyDescriptor,
ReverseManyToOneDescriptor,
)
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
@@ -96,7 +96,7 @@ class JSONBField(upstream_JSONBField):
# https://bitbucket.org/offline/django-annoying/src/a0de8b294db3/annoying/fields.py
class AutoSingleRelatedObjectDescriptor(SingleRelatedObjectDescriptor):
class AutoSingleRelatedObjectDescriptor(ReverseOneToOneDescriptor):
"""Descriptor for access to the object from its related class."""
def __get__(self, instance, instance_type=None):
@@ -139,7 +139,7 @@ def resolve_role_field(obj, field):
raise Exception(smart_text('{} refers to a {}, not a Role'.format(field, type(obj))))
ret.append(obj.id)
else:
if type(obj) is ManyRelatedObjectsDescriptor:
if type(obj) is ManyToManyDescriptor:
for o in obj.all():
ret += resolve_role_field(o, field_components[1])
else:
@@ -179,7 +179,7 @@ def is_implicit_parent(parent_role, child_role):
return False
class ImplicitRoleDescriptor(ReverseSingleRelatedObjectDescriptor):
class ImplicitRoleDescriptor(ForwardManyToOneDescriptor):
pass
@@ -230,18 +230,18 @@ class ImplicitRoleField(models.ForeignKey):
field_name, sep, field_attr = field_name.partition('.')
field = getattr(cls, field_name)
if type(field) is ReverseManyRelatedObjectsDescriptor or \
type(field) is ManyRelatedObjectsDescriptor:
if type(field) is ReverseManyToOneDescriptor or \
type(field) is ManyToManyDescriptor:
if '.' in field_attr:
raise Exception('Referencing deep roles through ManyToMany fields is unsupported.')
if type(field) is ReverseManyRelatedObjectsDescriptor:
if type(field) is ReverseManyToOneDescriptor:
sender = field.through
else:
sender = field.related.through
reverse = type(field) is ManyRelatedObjectsDescriptor
reverse = type(field) is ManyToManyDescriptor
m2m_changed.connect(self.m2m_update(field_attr, reverse), sender, weak=False)
def m2m_update(self, field_attr, _reverse):
@@ -415,6 +415,13 @@ class JSONSchemaField(JSONBField):
return value
@JSONSchemaField.format_checker.checks('vault_id')
def format_vault_id(value):
if '@' in value:
raise jsonschema.exceptions.FormatError('@ is not an allowed character')
return True
@JSONSchemaField.format_checker.checks('ssh_private_key')
def format_ssh_private_key(value):
# Sanity check: GCE, in particular, provides JSON-encoded private
@@ -754,3 +761,22 @@ class CredentialTypeInjectorField(JSONSchemaField):
code='invalid',
params={'value': value},
)
class AskForField(models.BooleanField):
"""
Denotes whether to prompt on launch for another field on the same template
"""
def __init__(self, allows_field=None, **kwargs):
super(AskForField, self).__init__(**kwargs)
self._allows_field = allows_field
@property
def allows_field(self):
if self._allows_field is None:
try:
return self.name[len('ask_'):-len('_on_launch')]
except AttributeError:
# self.name will be set by the model metaclass, not this field
raise Exception('Corresponding allows_field cannot be accessed until model is initialized.')
return self._allows_field
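When no explicit `allows_field` is given, `AskForField` slices the companion field's name out of the `ask_<field>_on_launch` pattern. The slice itself is plain string arithmetic, sketched standalone here:

```python
def allows_field_from_name(field_name):
    """Derive the prompted-for field from an 'ask_<field>_on_launch'
    boolean's name, as AskForField.allows_field does by default."""
    prefix, suffix = 'ask_', '_on_launch'
    if not (field_name.startswith(prefix) and field_name.endswith(suffix)):
        raise ValueError('not an ask_*_on_launch field: %s' % field_name)
    return field_name[len(prefix):-len(suffix)]

derived = allows_field_from_name('ask_limit_on_launch')
```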


@@ -2,12 +2,12 @@
# All Rights Reserved
from awx.main.utils import get_licenser
from django.core.management.base import NoArgsCommand
from django.core.management.base import BaseCommand
class Command(NoArgsCommand):
class Command(BaseCommand):
"""Returns license type, e.g., 'enterprise', 'open', 'none'"""
def handle(self, **options):
def handle(self, *args, **options):
super(Command, self).__init__()
return get_licenser().validate().get('license_type', 'none')


@@ -4,29 +4,28 @@
# Python
import datetime
import logging
from optparse import make_option
# Django
from django.core.management.base import NoArgsCommand
from django.core.management.base import BaseCommand
from django.utils.timezone import now
# AWX
from awx.main.models import ActivityStream
class Command(NoArgsCommand):
class Command(BaseCommand):
'''
Management command to purge old activity stream events.
'''
help = 'Remove old activity stream events from the database'
option_list = NoArgsCommand.option_list + (
make_option('--days', dest='days', type='int', default=90, metavar='N',
help='Remove activity stream events more than N days old'),
make_option('--dry-run', dest='dry_run', action='store_true',
default=False, help='Dry run mode (show items that would '
'be removed)'),)
def add_arguments(self, parser):
parser.add_argument('--days', dest='days', type=int, default=90, metavar='N',
help='Remove activity stream events more than N days old')
parser.add_argument('--dry-run', dest='dry_run', action='store_true',
default=False, help='Dry run mode (show items that would '
'be removed)')
def init_logging(self):
log_levels = dict(enumerate([logging.ERROR, logging.INFO,
@@ -61,7 +60,7 @@ class Command(NoArgsCommand):
n_deleted_items += len(pks_to_delete)
self.logger.log(99, "Removed %d items", n_deleted_items)
def handle_noargs(self, **options):
def handle(self, *args, **options):
self.verbosity = int(options.get('verbosity', 1))
self.init_logging()
self.days = int(options.get('days', 30))
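These management commands are being ported from the removed `optparse`-based `option_list` to Django's `add_arguments(parser)` hook, which receives a standard `argparse` parser. The same translation can be shown in plain `argparse`, outside Django (`BaseCommand` is not involved; the option names mirror the command above):

```python
import argparse

def build_parser():
    """Mirror the --days/--dry-run options with argparse, the API
    that add_arguments(parser) exposes in Django >= 1.8."""
    parser = argparse.ArgumentParser(prog='cleanup_activitystream')
    # optparse's type='int' string becomes the callable type=int
    parser.add_argument('--days', dest='days', type=int, default=90, metavar='N',
                        help='Remove activity stream events more than N days old')
    parser.add_argument('--dry-run', dest='dry_run', action='store_true',
                        default=False,
                        help='Dry run mode (show items that would be removed)')
    return parser

opts = build_parser().parse_args(['--days', '30', '--dry-run'])
```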


@@ -4,7 +4,6 @@
# Python
import re
from dateutil.relativedelta import relativedelta
from optparse import make_option
# Django
from django.core.management.base import BaseCommand, CommandError
@@ -93,19 +92,20 @@ class CleanupFacts(object):
class Command(BaseCommand):
help = 'Cleanup facts. For each host older than the value specified, keep one fact scan for each time window (granularity).'
option_list = BaseCommand.option_list + (
make_option('--older_than',
dest='older_than',
default='30d',
help='Specify the relative time to consider facts older than (w)eek (d)ay or (y)ear (i.e. 5d, 2w, 1y). Defaults to 30d.'),
make_option('--granularity',
dest='granularity',
default='1w',
help='Window duration to group same hosts by for deletion (w)eek (d)ay or (y)ear (i.e. 5d, 2w, 1y). Defaults to 1w.'),
make_option('--module',
dest='module',
default=None,
help='Limit cleanup to a particular module.'),)
def add_arguments(self, parser):
parser.add_argument('--older_than',
dest='older_than',
default='30d',
help='Specify the relative time to consider facts older than (w)eek (d)ay or (y)ear (i.e. 5d, 2w, 1y). Defaults to 30d.')
parser.add_argument('--granularity',
dest='granularity',
default='1w',
help='Window duration to group same hosts by for deletion (w)eek (d)ay or (y)ear (i.e. 5d, 2w, 1y). Defaults to 1w.')
parser.add_argument('--module',
dest='module',
default=None,
help='Limit cleanup to a particular module.')
def __init__(self):
super(Command, self).__init__()


@@ -4,10 +4,9 @@
# Python
import datetime
import logging
from optparse import make_option
# Django
from django.core.management.base import NoArgsCommand, CommandError
from django.core.management.base import BaseCommand, CommandError
from django.db import transaction
from django.utils.timezone import now
@@ -25,41 +24,40 @@ from awx.main.signals import ( # noqa
from django.db.models.signals import post_save, post_delete, m2m_changed # noqa
class Command(NoArgsCommand):
class Command(BaseCommand):
'''
Management command to cleanup old jobs and project updates.
'''
help = 'Remove old jobs, project and inventory updates from the database.'
option_list = NoArgsCommand.option_list + (
make_option('--days', dest='days', type='int', default=90, metavar='N',
help='Remove jobs/updates executed more than N days ago. Defaults to 90.'),
make_option('--dry-run', dest='dry_run', action='store_true',
default=False, help='Dry run mode (show items that would '
'be removed)'),
make_option('--jobs', dest='only_jobs', action='store_true',
default=False,
help='Remove jobs'),
make_option('--ad-hoc-commands', dest='only_ad_hoc_commands',
action='store_true', default=False,
help='Remove ad hoc commands'),
make_option('--project-updates', dest='only_project_updates',
action='store_true', default=False,
help='Remove project updates'),
make_option('--inventory-updates', dest='only_inventory_updates',
action='store_true', default=False,
help='Remove inventory updates'),
make_option('--management-jobs', default=False,
action='store_true', dest='only_management_jobs',
help='Remove management jobs'),
make_option('--notifications', dest='only_notifications',
action='store_true', default=False,
help='Remove notifications'),
make_option('--workflow-jobs', default=False,
action='store_true', dest='only_workflow_jobs',
help='Remove workflow jobs')
)
def add_arguments(self, parser):
parser.add_argument('--days', dest='days', type=int, default=90, metavar='N',
help='Remove jobs/updates executed more than N days ago. Defaults to 90.')
parser.add_argument('--dry-run', dest='dry_run', action='store_true',
default=False, help='Dry run mode (show items that would '
'be removed)')
parser.add_argument('--jobs', dest='only_jobs', action='store_true',
default=False,
help='Remove jobs')
parser.add_argument('--ad-hoc-commands', dest='only_ad_hoc_commands',
action='store_true', default=False,
help='Remove ad hoc commands')
parser.add_argument('--project-updates', dest='only_project_updates',
action='store_true', default=False,
help='Remove project updates')
parser.add_argument('--inventory-updates', dest='only_inventory_updates',
action='store_true', default=False,
help='Remove inventory updates')
parser.add_argument('--management-jobs', default=False,
action='store_true', dest='only_management_jobs',
help='Remove management jobs')
parser.add_argument('--notifications', dest='only_notifications',
action='store_true', default=False,
help='Remove notifications')
parser.add_argument('--workflow-jobs', default=False,
action='store_true', dest='only_workflow_jobs',
help='Remove workflow jobs')
def cleanup_jobs(self):
#jobs_qs = Job.objects.exclude(status__in=('pending', 'running'))
@@ -223,7 +221,7 @@ class Command(NoArgsCommand):
return skipped, deleted
@transaction.atomic
def handle_noargs(self, **options):
def handle(self, *args, **options):
self.verbosity = int(options.get('verbosity', 1))
self.init_logging()
self.days = int(options.get('days', 90))


@@ -45,10 +45,10 @@ class Command(BaseCommand):
inventory=i,
variables="ansible_connection: local",
created_by=superuser)
JobTemplate.objects.create(name='Demo Job Template',
playbook='hello_world.yml',
project=p,
inventory=i,
credential=c)
jt = JobTemplate.objects.create(name='Demo Job Template',
playbook='hello_world.yml',
project=p,
inventory=i)
jt.credentials.add(c)
print('Default organization added.')
print('Demo Credential, Inventory, and Job Template added.')

View File

@@ -1,7 +1,6 @@
# Copyright (c) 2016 Ansible, Inc.
# All Rights Reserved
from optparse import make_option
import subprocess
import warnings
@@ -22,12 +21,11 @@ class Command(BaseCommand):
'Specify `--hostname` to use this command.'
)
option_list = BaseCommand.option_list + (
make_option('--hostname', dest='hostname', type='string',
help='Hostname used during provisioning'),
make_option('--name', dest='name', type='string',
help='(PENDING DEPRECATION) Hostname used during provisioning'),
)
def add_arguments(self, parser):
parser.add_argument('--hostname', dest='hostname', type=str,
help='Hostname used during provisioning')
parser.add_argument('--name', dest='name', type=str,
help='(PENDING DEPRECATION) Hostname used during provisioning')
@transaction.atomic
def handle(self, *args, **options):

View File

@@ -4,7 +4,6 @@
# Python
import json
import logging
from optparse import make_option
import os
import re
import subprocess
@@ -15,7 +14,7 @@ import shutil
# Django
from django.conf import settings
from django.core.management.base import NoArgsCommand, CommandError
from django.core.management.base import BaseCommand, CommandError
from django.core.exceptions import ImproperlyConfigured
from django.db import connection, transaction
from django.utils.encoding import smart_text
@@ -86,10 +85,8 @@ class AnsibleInventoryLoader(object):
env['ANSIBLE_INVENTORY_UNPARSED_FAILED'] = '1'
venv_libdir = os.path.join(settings.ANSIBLE_VENV_PATH, "lib")
env.pop('PYTHONPATH', None) # default to none if no python_ver matches
for python_ver in ["python2.7", "python2.6"]:
if os.path.isdir(os.path.join(venv_libdir, python_ver)):
env['PYTHONPATH'] = os.path.join(venv_libdir, python_ver, "site-packages") + ":"
break
if os.path.isdir(os.path.join(venv_libdir, "python2.7")):
env['PYTHONPATH'] = os.path.join(venv_libdir, "python2.7", "site-packages") + ":"
return env
def get_base_args(self):
@@ -168,7 +165,7 @@ class AnsibleInventoryLoader(object):
data = json.loads(stdout)
if not isinstance(data, dict):
raise TypeError('Returned JSON must be a dictionary, got %s instead' % str(type(data)))
except:
except Exception:
logger.error('Failed to load JSON from: %s', stdout)
raise
return data
@@ -251,7 +248,7 @@ def load_inventory_source(source, group_filter_re=None,
return inventory.all_group
class Command(NoArgsCommand):
class Command(BaseCommand):
'''
Management command to import inventory from a directory, ini file, or
dynamic inventory script.
@@ -259,50 +256,46 @@ class Command(NoArgsCommand):
help = 'Import or sync external inventory sources'
option_list = NoArgsCommand.option_list + (
make_option('--inventory-name', dest='inventory_name', type='str',
default=None, metavar='n',
help='name of inventory to sync'),
make_option('--inventory-id', dest='inventory_id', type='int',
default=None, metavar='i', help='id of inventory to sync'),
make_option('--overwrite', dest='overwrite', action='store_true',
metavar="o", default=False,
help='overwrite the destination hosts and groups'),
make_option('--overwrite-vars', dest='overwrite_vars',
action='store_true', metavar="V", default=False,
help='overwrite (rather than merge) variables'),
make_option('--keep-vars', dest='keep_vars', action='store_true',
metavar="k", default=False,
help='use database variables if set'),
make_option('--custom', dest='custom', action='store_true',
metavar="c", default=False,
help='this is a custom inventory script'),
make_option('--source', dest='source', type='str', default=None,
metavar='s', help='inventory directory, file, or script '
'to load'),
make_option('--enabled-var', dest='enabled_var', type='str',
default=None, metavar='v', help='host variable used to '
'set/clear enabled flag when host is online/offline, may '
'be specified as "foo.bar" to traverse nested dicts.'),
make_option('--enabled-value', dest='enabled_value', type='str',
default=None, metavar='v', help='value of host variable '
'specified by --enabled-var that indicates host is '
'enabled/online.'),
make_option('--group-filter', dest='group_filter', type='str',
default=None, metavar='regex', help='regular expression '
'to filter group name(s); only matches are imported.'),
make_option('--host-filter', dest='host_filter', type='str',
default=None, metavar='regex', help='regular expression '
'to filter host name(s); only matches are imported.'),
make_option('--exclude-empty-groups', dest='exclude_empty_groups',
action='store_true', default=False, help='when set, '
'exclude all groups that have no child groups, hosts, or '
'variables.'),
make_option('--instance-id-var', dest='instance_id_var', type='str',
default=None, metavar='v', help='host variable that '
'specifies the unique, immutable instance ID, may be '
'specified as "foo.bar" to traverse nested dicts.'),
)
def add_arguments(self, parser):
parser.add_argument('--inventory-name', dest='inventory_name',
type=str, default=None, metavar='n',
help='name of inventory to sync')
parser.add_argument('--inventory-id', dest='inventory_id', type=int,
default=None, metavar='i',
help='id of inventory to sync')
parser.add_argument('--overwrite', dest='overwrite', action='store_true', default=False,
help='overwrite the destination hosts and groups')
parser.add_argument('--overwrite-vars', dest='overwrite_vars',
action='store_true', default=False,
help='overwrite (rather than merge) variables')
parser.add_argument('--keep-vars', dest='keep_vars', action='store_true', default=False,
help='use database variables if set')
parser.add_argument('--custom', dest='custom', action='store_true', default=False,
help='this is a custom inventory script')
parser.add_argument('--source', dest='source', type=str, default=None,
metavar='s', help='inventory directory, file, or script to load')
parser.add_argument('--enabled-var', dest='enabled_var', type=str,
default=None, metavar='v', help='host variable used to '
'set/clear enabled flag when host is online/offline, may '
'be specified as "foo.bar" to traverse nested dicts.')
parser.add_argument('--enabled-value', dest='enabled_value', type=str,
default=None, metavar='v', help='value of host variable '
'specified by --enabled-var that indicates host is '
'enabled/online.')
parser.add_argument('--group-filter', dest='group_filter', type=str,
default=None, metavar='regex', help='regular expression '
'to filter group name(s); only matches are imported.')
parser.add_argument('--host-filter', dest='host_filter', type=str,
default=None, metavar='regex', help='regular expression '
'to filter host name(s); only matches are imported.')
parser.add_argument('--exclude-empty-groups', dest='exclude_empty_groups',
action='store_true', default=False, help='when set, '
'exclude all groups that have no child groups, hosts, or '
'variables.')
parser.add_argument('--instance-id-var', dest='instance_id_var', type=str,
default=None, metavar='v', help='host variable that '
'specifies the unique, immutable instance ID, may be '
'specified as "foo.bar" to traverse nested dicts.')
def set_logging_level(self):
log_levels = dict(enumerate([logging.WARNING, logging.INFO,
@@ -352,7 +345,12 @@ class Command(NoArgsCommand):
enabled = bool(unicode(enabled_value) == unicode(enabled))
else:
enabled = bool(enabled)
return enabled
if enabled is default:
return None
elif isinstance(enabled, bool):
return enabled
else:
raise NotImplementedError('Value of enabled {} not understood.'.format(enabled))
def load_inventory_from_database(self):
'''
@@ -400,10 +398,10 @@ class Command(NoArgsCommand):
overwrite_vars=self.overwrite_vars,
)
self.inventory_update = self.inventory_source.create_inventory_update(
job_args=json.dumps(sys.argv),
job_env=dict(os.environ.items()),
job_cwd=os.getcwd(),
_eager_fields=dict(
job_args=json.dumps(sys.argv),
job_env=dict(os.environ.items()),
job_cwd=os.getcwd(),
execution_node=settings.CLUSTER_HOST_ID,
instance_group=InstanceGroup.objects.get(name='tower'))
)
@@ -927,7 +925,7 @@ class Command(NoArgsCommand):
self.inventory_update.license_error = True
self.inventory_update.save(update_fields=['license_error'])
def handle_noargs(self, **options):
def handle(self, *args, **options):
self.verbosity = int(options.get('verbosity', 1))
self.set_logging_level()
self.inventory_name = options.get('inventory_name', None)

View File
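Several of the options above (`--enabled-var`, `--instance-id-var`) accept a dotted path like `"foo.bar"` to traverse nested host variables. A minimal sketch of such a lookup (a hypothetical helper, not the AWX implementation):

```python
def lookup_dotted(variables, path, default=None):
    # Walk a dotted path ("foo.bar") through nested dicts of host variables.
    value = variables
    for key in path.split('.'):
        if not isinstance(value, dict) or key not in value:
            return default
        value = value[key]
    return value

host_vars = {'ec2': {'state': 'running'}, 'ansible_host': '10.0.0.5'}
print(lookup_dotted(host_vars, 'ec2.state'))    # running
print(lookup_dotted(host_vars, 'ec2.missing'))  # None
```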

@@ -2,14 +2,14 @@
# All Rights Reserved
from awx.main.models import Instance, InstanceGroup
from django.core.management.base import NoArgsCommand
from django.core.management.base import BaseCommand
class Command(NoArgsCommand):
class Command(BaseCommand):
"""List instances from the Tower database
"""
def handle(self, **options):
def handle(self, *args, **options):
super(Command, self).__init__()
for instance in Instance.objects.all():

View File

@@ -5,7 +5,6 @@ from awx.main.models import Instance
from awx.main.utils.pglock import advisory_lock
from django.conf import settings
from optparse import make_option
from django.db import transaction
from django.core.management.base import BaseCommand, CommandError
@@ -21,10 +20,9 @@ class Command(BaseCommand):
'Specify `--hostname` to use this command.'
)
option_list = BaseCommand.option_list + (
make_option('--hostname', dest='hostname', type='string',
help='Hostname used during provisioning'),
)
def add_arguments(self, parser):
parser.add_argument('--hostname', dest='hostname', type=str,
help='Hostname used during provisioning')
def _register_hostname(self, hostname):
if not hostname:

View File

@@ -5,20 +5,18 @@ import sys
from awx.main.utils.pglock import advisory_lock
from awx.main.models import Instance, InstanceGroup
from optparse import make_option
from django.core.management.base import BaseCommand, CommandError
class Command(BaseCommand):
option_list = BaseCommand.option_list + (
make_option('--queuename', dest='queuename', type='string',
help='Queue to create/update'),
make_option('--hostnames', dest='hostnames', type='string',
help='Comma-Delimited Hosts to add to the Queue'),
make_option('--controller', dest='controller', type='string', default='',
help='The controlling group (makes this an isolated group)'),
)
def add_arguments(self, parser):
parser.add_argument('--queuename', dest='queuename', type=str,
help='Queue to create/update')
parser.add_argument('--hostnames', dest='hostnames', type=str,
help='Comma-delimited hosts to add to the queue')
parser.add_argument('--controller', dest='controller', type=str,
default='', help='The controlling group (makes this an isolated group)')
def handle(self, **options):
queuename = options.get('queuename')

View File

@@ -3,8 +3,6 @@
import sys
from awx.main.models import Instance, InstanceGroup
from optparse import make_option
from django.core.management.base import BaseCommand, CommandError
@@ -14,14 +12,13 @@ class Command(BaseCommand):
"Remove an instance (specified by --hostname) from the specified queue (instance group).\n"
"In order to remove the queue, use the `unregister_queue` command.")
option_list = BaseCommand.option_list + (
make_option('--queuename', dest='queuename', type='string',
help='Queue to be removed from'),
make_option('--hostname', dest='hostname', type='string',
help='Host to remove from queue'),
)
def add_arguments(self, parser):
parser.add_argument('--queuename', dest='queuename', type=str,
help='Queue to be removed from')
parser.add_argument('--hostname', dest='hostname', type=str,
help='Host to remove from queue')
def handle(self, **options):
def handle(self, *args, **options):
if not options.get('queuename'):
raise CommandError('Must specify `--queuename` in order to use command.')
ig = InstanceGroup.objects.filter(name=options.get('queuename'))
@@ -36,4 +33,3 @@ class Command(BaseCommand):
i = i.first()
ig.instances.remove(i)
print("Instance removed from instance group")

View File

@@ -0,0 +1,180 @@
# Copyright (c) 2017 Ansible by Red Hat
# All Rights Reserved.
import sys
import time
import json
from django.utils import timezone
from django.core.management.base import BaseCommand
from awx.main.models import (
UnifiedJob,
Job,
AdHocCommand,
)
from awx.main.consumers import emit_channel_notification
from awx.api.serializers import (
JobEventWebSocketSerializer,
AdHocCommandEventWebSocketSerializer,
)
class ReplayJobEvents():
recording_start = None
replay_start = None
def now(self):
return timezone.now()
def start(self, first_event_created):
self.recording_start = first_event_created
self.replay_start = self.now()
def lateness(self, now, created):
time_passed = now - self.recording_start
job_event_time = created - self.replay_start
return (time_passed - job_event_time).total_seconds()
def get_job(self, job_id):
try:
unified_job = UnifiedJob.objects.get(id=job_id)
except UnifiedJob.DoesNotExist:
print("UnifiedJob {} not found.".format(job_id))
sys.exit(1)
return unified_job.get_real_instance()
def sleep(self, seconds):
time.sleep(seconds)
def replay_elapsed(self):
return (self.now() - self.replay_start)
def recording_elapsed(self, created):
return (created - self.recording_start)
def replay_offset(self, created, speed):
return self.replay_elapsed().total_seconds() - (self.recording_elapsed(created).total_seconds() * (1.0 / speed))
def get_job_events(self, job):
job_events = job.job_events.order_by('created')
if job_events.count() == 0:
raise RuntimeError("No events for job id {}".format(job.id))
return job_events
def get_serializer(self, job):
if type(job) is Job:
return JobEventWebSocketSerializer
elif type(job) is AdHocCommand:
return AdHocCommandEventWebSocketSerializer
else:
raise RuntimeError("Job is of type {} and replay is not yet supported.".format(type(job)))
def run(self, job_id, speed=1.0, verbosity=0):
stats = {
'events_ontime': {
'total': 0,
'percentage': 0,
},
'events_late': {
'total': 0,
'percentage': 0,
'lateness_total': 0,
'lateness_average': 0,
},
'events_total': 0,
'events_distance_total': 0,
'events_distance_average': 0,
'recording_start': 0,
'recording_end': 0,
'recording_duration': 0,
'replay_start': 0,
'replay_end': 0,
'replay_duration': 0,
}
try:
job = self.get_job(job_id)
job_events = self.get_job_events(job)
serializer = self.get_serializer(job)
except RuntimeError as e:
print("{}".format(e.message))
sys.exit(1)
je_previous = None
for je_current in job_events:
if not je_previous:
stats['recording_start'] = je_current.created
self.start(je_current.created)
stats['replay_start'] = self.replay_start
je_previous = je_current
je_serialized = serializer(je_current).data
emit_channel_notification('{}-{}'.format(je_serialized['group_name'], job.id), je_serialized)
replay_offset = self.replay_offset(je_previous.created, speed)
recording_diff = (je_current.created - je_previous.created).total_seconds() * (1.0 / speed)
stats['events_distance_total'] += recording_diff
if verbosity >= 3:
print("recording: next job in {} seconds".format(recording_diff))
if replay_offset >= 0:
replay_diff = recording_diff - replay_offset
if replay_diff > 0:
stats['events_ontime']['total'] += 1
if verbosity >= 3:
print("\treplay: sleep for {} seconds".format(replay_diff))
self.sleep(replay_diff)
else:
stats['events_late']['total'] += 1
stats['events_late']['lateness_total'] += (replay_diff * -1)
if verbosity >= 3:
print("\treplay: too far behind to sleep {} seconds".format(replay_diff))
else:
replay_offset = self.replay_offset(je_current.created, speed)
stats['events_late']['lateness_total'] += (replay_offset * -1)
stats['events_late']['total'] += 1
if verbosity >= 3:
print("\treplay: behind by {} seconds".format(replay_offset))
stats['events_total'] += 1
je_previous = je_current
stats['replay_end'] = self.now()
stats['replay_duration'] = (stats['replay_end'] - stats['replay_start']).total_seconds()
stats['replay_start'] = stats['replay_start'].isoformat()
stats['replay_end'] = stats['replay_end'].isoformat()
stats['recording_end'] = je_current.created
stats['recording_duration'] = (stats['recording_end'] - stats['recording_start']).total_seconds()
stats['recording_start'] = stats['recording_start'].isoformat()
stats['recording_end'] = stats['recording_end'].isoformat()
stats['events_ontime']['percentage'] = (stats['events_ontime']['total'] / float(stats['events_total'])) * 100.00
stats['events_late']['percentage'] = (stats['events_late']['total'] / float(stats['events_total'])) * 100.00
stats['events_distance_average'] = stats['events_distance_total'] / stats['events_total']
stats['events_late']['lateness_average'] = stats['events_late']['lateness_total'] / stats['events_late']['total']
if verbosity >= 2:
print(json.dumps(stats, indent=4, sort_keys=True))
class Command(BaseCommand):
help = 'Replay job events over websockets ordered by created on date.'
def add_arguments(self, parser):
parser.add_argument('--job_id', dest='job_id', type=int, metavar='j',
help='Id of the job to replay (job or adhoc)')
parser.add_argument('--speed', dest='speed', type=int, metavar='s',
help='Speedup factor.')
def handle(self, *args, **options):
job_id = options.get('job_id')
speed = options.get('speed') or 1
verbosity = options.get('verbosity') or 0
replayer = ReplayJobEvents()
replayer.run(job_id, speed, verbosity)

View File
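The pacing loop in `run()` above hinges on `replay_offset`: an event recorded `t` seconds into the original run is due `t / speed` seconds into the replay. A standalone sketch of that arithmetic with plain floats instead of datetimes:

```python
def replay_offset(replay_elapsed, recording_elapsed, speed):
    # Positive: the replay is ahead of schedule and may sleep;
    # negative: the replay is behind and the event is emitted late.
    return replay_elapsed - recording_elapsed * (1.0 / speed)

# 10s into a 2x-speed replay, an event originally recorded at t=30s
# was due at t=15s of the replay, so we are 5 seconds behind:
print(replay_offset(10.0, 30.0, 2.0))  # -5.0
```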

@@ -16,7 +16,7 @@ from kombu.mixins import ConsumerMixin
# Django
from django.conf import settings
from django.core.management.base import NoArgsCommand
from django.core.management.base import BaseCommand
from django.db import connection as django_connection
from django.db import DatabaseError
from django.core.cache import cache as django_cache
@@ -147,7 +147,7 @@ class CallbackBrokerWorker(ConsumerMixin):
logger.error('Detail: {}'.format(tb))
class Command(NoArgsCommand):
class Command(BaseCommand):
'''
Save Job Callback receiver (see awx.plugins.callbacks.job_event_callback)
Runs as a management command and receives job save events. It then hands
@@ -155,8 +155,8 @@ class Command(NoArgsCommand):
'''
help = 'Launch the job callback receiver'
def handle_noargs(self, **options):
with Connection(settings.BROKER_URL) as conn:
def handle(self, *args, **options):
with Connection(settings.CELERY_BROKER_URL) as conn:
try:
worker = CallbackBrokerWorker(conn)
worker.run()

View File

@@ -1,13 +1,11 @@
# Copyright (c) 2015 Ansible, Inc.
# All Rights Reserved
from optparse import make_option
# Django
from django.core.management.base import BaseCommand
# AWX
from awx.main.models import * # noqa
from awx.main.models import UnifiedJob
class Command(BaseCommand):
@@ -17,14 +15,13 @@ class Command(BaseCommand):
help = 'Display some simple statistics'
option_list = BaseCommand.option_list + (
make_option('--stat',
action='store',
dest='stat',
type="string",
default="jobs_running",
help='Select which stat to get information for'),
)
def add_arguments(self, parser):
parser.add_argument('--stat',
action='store',
dest='stat',
type=str,
default="jobs_running",
help='Select which stat to get information for')
def job_stats(self, state):
return UnifiedJob.objects.filter(status=state).count()
@@ -34,5 +31,3 @@ class Command(BaseCommand):
self.stdout.write(str(self.job_stats(options['stat'][5:])))
else:
self.stdout.write("Supported stats: jobs_{state}")

View File
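The `--stat` handling above strips the `jobs_` prefix (`options['stat'][5:]`) and counts unified jobs in that state. A stand-in sketch with a plain list instead of the ORM query (raising here instead of printing the supported-stats hint):

```python
def job_stats(jobs, stat):
    # 'jobs_running'[5:] == 'running'; count jobs currently in that state.
    if not stat.startswith('jobs_'):
        raise ValueError('Supported stats: jobs_{state}')
    state = stat[5:]
    return sum(1 for j in jobs if j == state)

print(job_stats(['running', 'pending', 'running'], 'jobs_running'))  # 2
```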

@@ -5,7 +5,6 @@ import sys
from awx.main.utils.pglock import advisory_lock
from awx.main.models import InstanceGroup
from optparse import make_option
from django.db import transaction
from django.core.management.base import BaseCommand, CommandError
@@ -17,13 +16,12 @@ class Command(BaseCommand):
"Instances inside of queue will continue to exist, \n"
"but jobs will no longer be processed by queue.")
option_list = BaseCommand.option_list + (
make_option('--queuename', dest='queuename', type='string',
help='Queue to create/update'),
)
def add_arguments(self, parser):
parser.add_argument('--queuename', dest='queuename', type=str,
help='Queue to create/update')
@transaction.atomic
def handle(self, **options):
def handle(self, *args, **options):
queuename = options.get('queuename')
if not queuename:
raise CommandError('Must specify `--queuename` in order to use command.')

View File

@@ -1,9 +1,6 @@
# Copyright (c) 2016 Ansible, Inc.
# All Rights Reserved
# Python
from optparse import make_option
# Django
from django.core.management.base import BaseCommand
from django.core.management.base import CommandError
@@ -25,12 +22,11 @@ class UpdatePassword(object):
class Command(BaseCommand):
option_list = BaseCommand.option_list + (
make_option('--username', dest='username', action='store', type='string', default=None,
help='username to change the password for'),
make_option('--password', dest='password', action='store', type='string', default=None,
help='new password for user'),
)
def add_arguments(self, parser):
parser.add_argument('--username', dest='username', action='store', type=str, default=None,
help='username to change the password for')
parser.add_argument('--password', dest='password', action='store', type=str, default=None,
help='new password for user')
def handle(self, *args, **options):
if not options['username']:
@@ -43,5 +39,3 @@ class Command(BaseCommand):
if res:
return "Password updated"
return "Password not updated"

View File

@@ -12,8 +12,6 @@ class Migration(migrations.Migration):
replaces = [
(b'main', '0035_v310_remove_tower_settings'),
(b'main', '0036_v311_insights'),
(b'main', '0037_v313_instance_version'),
]
operations = [
@@ -36,11 +34,4 @@ class Migration(migrations.Migration):
name='scm_type',
field=models.CharField(default=b'', choices=[(b'', 'Manual'), (b'git', 'Git'), (b'hg', 'Mercurial'), (b'svn', 'Subversion'), (b'insights', 'Red Hat Insights')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
),
migrations.AddField(
model_name='instance',
name='version',
field=models.CharField(max_length=24, blank=True),
),
]

View File

@@ -0,0 +1,28 @@
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('main', '0005_squashed_v310_v313_updates'),
]
replaces = [
(b'main', '0036_v311_insights'),
]
operations = [
migrations.AlterField(
model_name='project',
name='scm_type',
field=models.CharField(default=b'', choices=[(b'', 'Manual'), (b'git', 'Git'), (b'hg', 'Mercurial'), (b'svn', 'Subversion'), (b'insights', 'Red Hat Insights')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
),
migrations.AlterField(
model_name='projectupdate',
name='scm_type',
field=models.CharField(default=b'', choices=[(b'', 'Manual'), (b'git', 'Git'), (b'hg', 'Mercurial'), (b'svn', 'Subversion'), (b'insights', 'Red Hat Insights')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
),
]

View File

@@ -0,0 +1,24 @@
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('main', '0005a_squashed_v310_v313_updates'),
]
replaces = [
(b'main', '0037_v313_instance_version'),
]
operations = [
# Add the instance version field (carried over from 0037_v313_instance_version).
migrations.AddField(
model_name='instance',
name='version',
field=models.CharField(max_length=24, blank=True),
),
]

View File

@@ -18,7 +18,7 @@ from awx.main.models import Host
class Migration(migrations.Migration):
dependencies = [
('main', '0005_squashed_v310_v313_updates'),
('main', '0005b_squashed_v310_v313_updates'),
]
operations = [

View File

@@ -0,0 +1,55 @@
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import migrations, models
from awx.main.migrations import _migration_utils as migration_utils
from awx.main.migrations import _credentialtypes as credentialtypes
from awx.main.migrations._multi_cred import migrate_to_multi_cred
class Migration(migrations.Migration):
dependencies = [
('main', '0008_v320_drop_v1_credential_fields'),
]
operations = [
migrations.AddField(
model_name='unifiedjob',
name='credentials',
field=models.ManyToManyField(related_name='unifiedjobs', to='main.Credential'),
),
migrations.AddField(
model_name='unifiedjobtemplate',
name='credentials',
field=models.ManyToManyField(related_name='unifiedjobtemplates', to='main.Credential'),
),
migrations.RunPython(migration_utils.set_current_apps_for_migrations),
migrations.RunPython(migrate_to_multi_cred),
migrations.RemoveField(
model_name='job',
name='credential',
),
migrations.RemoveField(
model_name='job',
name='extra_credentials',
),
migrations.RemoveField(
model_name='job',
name='vault_credential',
),
migrations.RemoveField(
model_name='jobtemplate',
name='credential',
),
migrations.RemoveField(
model_name='jobtemplate',
name='extra_credentials',
),
migrations.RemoveField(
model_name='jobtemplate',
name='vault_credential',
),
migrations.RunPython(credentialtypes.add_vault_id_field)
]

View File

@@ -0,0 +1,144 @@
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
import awx.main.fields
from awx.main.migrations import _migration_utils as migration_utils
from awx.main.migrations._multi_cred import migrate_workflow_cred, migrate_workflow_cred_reverse
from awx.main.migrations._scan_jobs import remove_scan_type_nodes
class Migration(migrations.Migration):
dependencies = [
('main', '0009_v330_multi_credential'),
]
operations = [
migrations.AddField(
model_name='schedule',
name='char_prompts',
field=awx.main.fields.JSONField(default={}, blank=True),
),
migrations.AddField(
model_name='schedule',
name='credentials',
field=models.ManyToManyField(related_name='schedules', to='main.Credential'),
),
migrations.AddField(
model_name='schedule',
name='inventory',
field=models.ForeignKey(related_name='schedules', on_delete=django.db.models.deletion.SET_NULL, default=None, blank=True, to='main.Inventory', null=True),
),
migrations.AddField(
model_name='schedule',
name='survey_passwords',
field=awx.main.fields.JSONField(default={}, editable=False, blank=True),
),
migrations.AddField(
model_name='workflowjobnode',
name='credentials',
field=models.ManyToManyField(related_name='workflowjobnodes', to='main.Credential'),
),
migrations.AddField(
model_name='workflowjobnode',
name='extra_data',
field=awx.main.fields.JSONField(default={}, blank=True),
),
migrations.AddField(
model_name='workflowjobnode',
name='survey_passwords',
field=awx.main.fields.JSONField(default={}, editable=False, blank=True),
),
migrations.AddField(
model_name='workflowjobtemplatenode',
name='credentials',
field=models.ManyToManyField(related_name='workflowjobtemplatenodes', to='main.Credential'),
),
migrations.AddField(
model_name='workflowjobtemplatenode',
name='extra_data',
field=awx.main.fields.JSONField(default={}, blank=True),
),
migrations.AddField(
model_name='workflowjobtemplatenode',
name='survey_passwords',
field=awx.main.fields.JSONField(default={}, editable=False, blank=True),
),
# Run data migration before removing the old credential field
migrations.RunPython(migration_utils.set_current_apps_for_migrations, migrations.RunPython.noop),
migrations.RunPython(migrate_workflow_cred, migrate_workflow_cred_reverse),
migrations.RunPython(remove_scan_type_nodes, migrations.RunPython.noop),
migrations.RemoveField(
model_name='workflowjobnode',
name='credential',
),
migrations.RemoveField(
model_name='workflowjobtemplatenode',
name='credential',
),
migrations.CreateModel(
name='JobLaunchConfig',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('extra_data', awx.main.fields.JSONField(blank=True, default={})),
('survey_passwords', awx.main.fields.JSONField(blank=True, default={}, editable=False)),
('char_prompts', awx.main.fields.JSONField(blank=True, default={})),
('credentials', models.ManyToManyField(related_name='joblaunchconfigs', to='main.Credential')),
('inventory', models.ForeignKey(blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='joblaunchconfigs', to='main.Inventory')),
('job', models.OneToOneField(editable=False, on_delete=django.db.models.deletion.CASCADE, related_name='launch_config', to='main.UnifiedJob')),
],
),
migrations.AddField(
model_name='workflowjobtemplate',
name='ask_variables_on_launch',
field=awx.main.fields.AskForField(default=False),
),
migrations.AlterField(
model_name='jobtemplate',
name='ask_credential_on_launch',
field=awx.main.fields.AskForField(default=False),
),
migrations.AlterField(
model_name='jobtemplate',
name='ask_diff_mode_on_launch',
field=awx.main.fields.AskForField(default=False),
),
migrations.AlterField(
model_name='jobtemplate',
name='ask_inventory_on_launch',
field=awx.main.fields.AskForField(default=False),
),
migrations.AlterField(
model_name='jobtemplate',
name='ask_job_type_on_launch',
field=awx.main.fields.AskForField(default=False),
),
migrations.AlterField(
model_name='jobtemplate',
name='ask_limit_on_launch',
field=awx.main.fields.AskForField(default=False),
),
migrations.AlterField(
model_name='jobtemplate',
name='ask_skip_tags_on_launch',
field=awx.main.fields.AskForField(default=False),
),
migrations.AlterField(
model_name='jobtemplate',
name='ask_tags_on_launch',
field=awx.main.fields.AskForField(default=False),
),
migrations.AlterField(
model_name='jobtemplate',
name='ask_variables_on_launch',
field=awx.main.fields.AskForField(default=False),
),
migrations.AlterField(
model_name='jobtemplate',
name='ask_verbosity_on_launch',
field=awx.main.fields.AskForField(default=False),
),
]

View File

@@ -0,0 +1,22 @@
# -*- coding: utf-8 -*-
# Python
from __future__ import unicode_literals
# Django
from django.db import migrations, models
# AWX
from awx.main.migrations import _migration_utils as migration_utils
from awx.main.migrations._reencrypt import blank_old_start_args
class Migration(migrations.Migration):
dependencies = [
('main', '0010_saved_launchtime_configs'),
]
operations = [
migrations.RunPython(migration_utils.set_current_apps_for_migrations, migrations.RunPython.noop),
migrations.RunPython(blank_old_start_args, migrations.RunPython.noop),
]

View File

@@ -0,0 +1,23 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.7 on 2017-12-11 16:40
from __future__ import unicode_literals
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('main', '0011_blank_start_args'),
]
operations = [
migrations.AlterField(
model_name='workflowjobtemplatenode',
name='workflow_job_template',
field=models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, related_name='workflow_job_template_nodes', to='main.WorkflowJobTemplate'),
preserve_default=False,
),
]

View File

@@ -0,0 +1,44 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.7 on 2017-12-12 18:56
from __future__ import unicode_literals

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('main', '0012_non_blank_workflow'),
    ]

    operations = [
        migrations.AlterField(
            model_name='unifiedjob',
            name='result_stdout_text',
            field=models.TextField(editable=False, null=True),
        ),
        # Using SeparateDatabaseAndState here allows us to update the migration
        # state so that Django thinks the UnifiedJob.result_stdout_text field
        # is gone _without_ actually deleting the underlying column/data
        migrations.SeparateDatabaseAndState(state_operations=[
            migrations.RemoveField(
                model_name='unifiedjob',
                name='result_stdout_text',
            ),
        ]),
        # On the other side of the equation, this migration introduces a new
        # model which is *unmanaged* (meaning a new table is not created for
        # it); instead, this sort of "virtual" model is used to maintain an
        # ORM reference to the actual `main_unifiedjob.result_stdout_text` column
        migrations.CreateModel(
            name='UnifiedJobDeprecatedStdout',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('result_stdout_text', models.TextField(editable=False, null=True)),
            ],
            options={
                'db_table': 'main_unifiedjob',
                'managed': False,
            },
        ),
    ]
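The effect of this migration — keeping the large column in `main_unifiedjob`, but reading it only through a separate model pointed at the same table — can be sketched outside Django with plain sqlite3. This is an illustration of the idea, not AWX code; the `list_jobs`/`fetch_stdout` helpers and the sample row are invented for the example:

```python
import sqlite3

# In-memory stand-in for the main_unifiedjob table, including the
# large deprecated column that the unmanaged model isolates.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE main_unifiedjob (
        id INTEGER PRIMARY KEY,
        status TEXT,
        result_stdout_text TEXT
    )
""")
conn.execute(
    "INSERT INTO main_unifiedjob VALUES (1, 'successful', ?)",
    ("x" * 5_000_000,),  # a ~5MB blob, per the commit message
)


def list_jobs(conn):
    # The everyday view of the table: never touches the blob column,
    # mirroring how queries against UnifiedJob now skip the stdout data.
    return conn.execute("SELECT id, status FROM main_unifiedjob").fetchall()


def fetch_stdout(conn, pk):
    # The "unmanaged" view: same table, same column, but fetched only
    # on demand — the role UnifiedJobDeprecatedStdout plays.
    row = conn.execute(
        "SELECT result_stdout_text FROM main_unifiedjob WHERE id = ?", (pk,)
    ).fetchone()
    return row[0]


jobs = list_jobs(conn)          # cheap: the blob is never read
stdout = fetch_stdout(conn, 1)  # explicit, on-demand fetch
```

Both helpers hit the one physical table; only the second one pays for the blob, which is exactly the performance property the commit message is after.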

View File

@@ -173,3 +173,8 @@ def migrate_job_credentials(apps, schema_editor):
    finally:
        utils.get_current_apps = orig_current_apps


def add_vault_id_field(apps, schema_editor):
    vault_credtype = CredentialType.objects.get(kind='vault')
    vault_credtype.inputs = CredentialType.defaults.get('vault')().inputs
    vault_credtype.save()

View File

@@ -0,0 +1,34 @@
def migrate_to_multi_cred(app, schema_editor):
    Job = app.get_model('main', 'Job')
    JobTemplate = app.get_model('main', 'JobTemplate')
    for cls in (Job, JobTemplate):
        for j in cls.objects.iterator():
            if j.credential:
                j.credentials.add(j.credential)
            if j.vault_credential:
                j.credentials.add(j.vault_credential)
            for cred in j.extra_credentials.all():
                j.credentials.add(cred)


def migrate_workflow_cred(app, schema_editor):
    WorkflowJobTemplateNode = app.get_model('main', 'WorkflowJobTemplateNode')
    WorkflowJobNode = app.get_model('main', 'WorkflowJobNode')
    for cls in (WorkflowJobNode, WorkflowJobTemplateNode):
        for node in cls.objects.iterator():
            if node.credential:
                node.credentials.add(node.credential)


def migrate_workflow_cred_reverse(app, schema_editor):
    WorkflowJobTemplateNode = app.get_model('main', 'WorkflowJobTemplateNode')
    WorkflowJobNode = app.get_model('main', 'WorkflowJobNode')
    for cls in (WorkflowJobNode, WorkflowJobTemplateNode):
        for node in cls.objects.iterator():
            cred = node.credentials.first()
            if cred:
                node.credential = cred
                node.save()
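Stripped of the ORM, the forward/reverse pair above is just copying a single legacy foreign key into a many-to-many set and picking one value back out. A plain-Python sketch under that reading (the `Node` class and credential strings are invented for illustration):

```python
class Node:
    """Stand-in for a workflow node: one legacy FK, one new M2M set."""

    def __init__(self, credential=None):
        self.credential = credential
        self.credentials = set()


def migrate_workflow_cred(nodes):
    # forward: copy the old single credential into the new M2M
    for node in nodes:
        if node.credential:
            node.credentials.add(node.credential)


def migrate_workflow_cred_reverse(nodes):
    # reverse: put some one credential back into the legacy field,
    # like .first() on a related manager
    for node in nodes:
        cred = next(iter(node.credentials), None)
        if cred:
            node.credential = cred


nodes = [Node('machine-cred'), Node()]
migrate_workflow_cred(nodes)
```

Note the reverse direction is lossy by construction: a node that accumulated several credentials can carry only one back into the single FK field.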

View File

@@ -74,3 +74,20 @@ def _unified_jobs(apps):
        uj.start_args = decrypt_field(uj, 'start_args')
        uj.start_args = encrypt_field(uj, 'start_args')
        uj.save()


def blank_old_start_args(apps, schema_editor):
    UnifiedJob = apps.get_model('main', 'UnifiedJob')
    for uj in UnifiedJob.objects.defer('result_stdout_text').exclude(start_args='').iterator():
        if uj.status in ['running', 'pending', 'new', 'waiting']:
            continue
        try:
            args_dict = decrypt_field(uj, 'start_args')
        except ValueError:
            args_dict = None
        if args_dict == {}:
            continue
        if uj.start_args:
            logger.debug('Blanking job args for %s', uj.pk)
            uj.start_args = ''
            uj.save()

View File

@@ -12,7 +12,7 @@ logger = logging.getLogger('awx.main.migrations')
def _create_fact_scan_project(ContentType, Project, org):
    ct = ContentType.objects.get_for_model(Project)
-    name = "Tower Fact Scan - {}".format(org.name if org else "No Organization")
+    name = u"Tower Fact Scan - {}".format(org.name if org else "No Organization")
    proj = Project(name=name,
                   scm_url='https://github.com/ansible/awx-facts-playbooks',
                   scm_type='git',
@@ -82,3 +82,21 @@ def _migrate_scan_job_templates(apps):
def migrate_scan_job_templates(apps, schema_editor):
    _migrate_scan_job_templates(apps)


def remove_scan_type_nodes(apps, schema_editor):
    WorkflowJobTemplateNode = apps.get_model('main', 'WorkflowJobTemplateNode')
    WorkflowJobNode = apps.get_model('main', 'WorkflowJobNode')
    for cls in (WorkflowJobNode, WorkflowJobTemplateNode):
        for node in cls.objects.iterator():
            prompts = node.char_prompts
            if prompts.get('job_type', None) == 'scan':
                log_text = '{} set job_type to scan, which was deprecated in 3.2, removing.'.format(cls)
                if cls == WorkflowJobNode:
                    logger.info(log_text)
                else:
                    logger.debug(log_text)
                prompts.pop('job_type')
                node.char_prompts = prompts
                node.save()

View File

@@ -145,17 +145,3 @@ activity_stream_registrar.connect(WorkflowJob)
# prevent API filtering on certain Django-supplied sensitive fields
prevent_search(User._meta.get_field('password'))

-# Always, always, always defer result_stdout_text for polymorphic UnifiedJob rows
-# TODO: remove this defer in 3.3 when we implement https://github.com/ansible/ansible-tower/issues/5436
-def defer_stdout(f):
-    def _wrapped(*args, **kwargs):
-        objs = f(*args, **kwargs)
-        objs.query.deferred_loading[0].add('result_stdout_text')
-        return objs
-    return _wrapped
-
-for cls in UnifiedJob.__subclasses__():
-    cls.base_objects.filter = defer_stdout(cls.base_objects.filter)
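The hack being deleted here monkeypatched `filter()` on every `UnifiedJob` subclass so that each returned queryset silently deferred the stdout column. The wrapping pattern itself, minus Django, looks like the following sketch (the `FakeQuerySet` and `filter_jobs` names are invented stand-ins for a manager's queryset machinery):

```python
def defer_field(f, field):
    """Wrap a queryset-returning callable so the named field is always
    added to the deferred set of whatever it returns."""
    def _wrapped(*args, **kwargs):
        qs = f(*args, **kwargs)
        qs.deferred.add(field)  # the real hack poked query.deferred_loading
        return qs
    return _wrapped


class FakeQuerySet:
    # Minimal stand-in: Django tracks deferral on qs.query.deferred_loading
    def __init__(self):
        self.deferred = set()


def filter_jobs():
    return FakeQuerySet()


# patch in place, exactly as the removed loop did per subclass
filter_jobs = defer_field(filter_jobs, 'result_stdout_text')
qs = filter_jobs()
```

As the commit message notes, sprinkling deferral like this is easy to get wrong and interacts badly with RBAC and polymorphic queries — which is why the unmanaged-model migration above replaces it outright.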

View File

@@ -3,8 +3,6 @@
# Python
import datetime
-import hashlib
-import hmac
import logging
from urlparse import urljoin
@@ -156,13 +154,6 @@ class AdHocCommand(UnifiedJob, JobNotificationMixin):
    def get_ui_url(self):
        return urljoin(settings.TOWER_URL_BASE, "/#/ad_hoc_commands/{}".format(self.pk))

-    @property
-    def task_auth_token(self):
-        '''Return temporary auth token used for task requests via API.'''
-        if self.status == 'running':
-            h = hmac.new(settings.SECRET_KEY, self.created.isoformat(), digestmod=hashlib.sha1)
-            return '%d-%s' % (self.pk, h.hexdigest())

    @property
    def notification_templates(self):
        all_orgs = set()

View File

@@ -1,13 +1,6 @@
# Copyright (c) 2015 Ansible, Inc.
# All Rights Reserved.
-# Python
-import json
-import shlex

-# PyYAML
-import yaml

# Django
from django.db import models
from django.core.exceptions import ValidationError, ObjectDoesNotExist
@@ -21,7 +14,7 @@ from taggit.managers import TaggableManager
from crum import get_current_user
# AWX
-from awx.main.utils import encrypt_field
+from awx.main.utils import encrypt_field, parse_yaml_or_json

__all__ = ['prevent_search', 'VarsDictProperty', 'BaseModel', 'CreatedModifiedModel',
           'PasswordFieldsModel', 'PrimordialModel', 'CommonModel',
@@ -42,6 +35,11 @@ JOB_TYPE_CHOICES = [
    (PERM_INVENTORY_SCAN, _('Scan')),
]

+NEW_JOB_TYPE_CHOICES = [
+    (PERM_INVENTORY_DEPLOY, _('Run')),
+    (PERM_INVENTORY_CHECK, _('Check')),
+]

AD_HOC_JOB_TYPE_CHOICES = [
    (PERM_INVENTORY_DEPLOY, _('Run')),
    (PERM_INVENTORY_CHECK, _('Check')),
@@ -80,26 +78,7 @@ class VarsDictProperty(object):
        if hasattr(v, 'items'):
            return v
        v = v.encode('utf-8')
-        d = None
-        try:
-            d = json.loads(v.strip() or '{}')
-        except ValueError:
-            pass
-        if d is None:
-            try:
-                d = yaml.safe_load(v)
-                # This can happen if the whole file is commented out
-                if d is None:
-                    d = {}
-            except yaml.YAMLError:
-                pass
-        if d is None and self.key_value:
-            d = {}
-            for kv in [x.decode('utf-8') for x in shlex.split(v, posix=True)]:
-                if '=' in kv:
-                    k, v = kv.split('=', 1)
-                    d[k] = v
-        return d if hasattr(d, 'items') else {}
+        return parse_yaml_or_json(v)

    def __set__(self, obj, value):
        raise AttributeError('readonly property')
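The replaced body folds a chain of fallbacks (JSON, then YAML, then shell-style `key=value` pairs) into the shared `parse_yaml_or_json` helper. A stdlib-only sketch of the same fallback chain follows; it deliberately omits the YAML branch to avoid the PyYAML dependency, so it is a simplification for illustration, not AWX's actual helper:

```python
import json
import shlex


def parse_json_or_kv(value, key_value=True):
    # 1) try JSON first; an empty string parses as an empty dict
    try:
        d = json.loads(value.strip() or '{}')
    except ValueError:
        d = None
    # (the real helper tries yaml.safe_load here as a second step)
    # 2) fall back to shell-style key=value pairs
    if d is None and key_value:
        d = {}
        for kv in shlex.split(value, posix=True):
            if '=' in kv:
                k, v = kv.split('=', 1)
                d[k] = v
    # only dict-like results count, mirroring the original guard
    return d if hasattr(d, 'items') else {}
```

For example, `'{"a": 1}'` takes the JSON path, `'a=1 b=2'` falls through to the `key=value` parser, and non-mapping input such as a JSON list collapses to `{}`.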
