Compare commits


184 Commits
2.1.1 ... 3.0.0

Author SHA1 Message Date
softwarefactory-project-zuul[bot]
4788f0814f Merge pull request #3045 from jakemcdermott/regenerate-package-lock
updating package-lock.json

Reviewed-by: Shane McDonald <me@shanemcd.com>
             https://github.com/shanemcd
2019-01-22 15:01:43 +00:00
softwarefactory-project-zuul[bot]
c528ece5df Merge pull request #3046 from shanemcd/3.0.0
AWX 3.0.0

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-22 14:58:33 +00:00
Jake McDermott
a1c03cd6a1 update license files 2019-01-22 09:33:13 -05:00
Shane McDonald
42fbb81337 AWX 3.0.0 2019-01-22 09:32:39 -05:00
softwarefactory-project-zuul[bot]
5286e24721 Merge pull request #3044 from ryanpetrello/dispatcher-reap-db-error
detect dead DB connections in the dispatcher when reaping jobs

Reviewed-by: Chris Meyers
             https://github.com/chrismeyersfsu
2019-01-22 14:32:34 +00:00
Jake McDermott
0bde309d23 updating package-lock.json 2019-01-22 09:04:27 -05:00
Ryan Petrello
b2442d42a3 detect dead DB connections in the dispatcher when reaping jobs 2019-01-22 08:40:26 -05:00
softwarefactory-project-zuul[bot]
18409f89c5 Merge pull request #3042 from ryanpetrello/py3-sso-complete
fix a py3 bug that breaks the SSO complete endpoint

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-22 13:12:41 +00:00
softwarefactory-project-zuul[bot]
88d5fb0420 Merge pull request #3039 from AlanCoding/inventory_venv
Use custom virtual environment in inventory updates

Reviewed-by: Ryan Petrello
             https://github.com/ryanpetrello
2019-01-22 13:12:37 +00:00
softwarefactory-project-zuul[bot]
1cc0f81913 Merge pull request #3034 from rooftopcellist/upgrade_social_auth_core
upgrade social-auth-core to v3.0.0

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-21 22:10:50 +00:00
Ryan Petrello
8cb8e63db5 fix a py3 bug that breaks the SSO complete endpoint 2019-01-21 17:04:13 -05:00
Christian Adams
8597670299 upgrade social-auth-core to v3.0.0 2019-01-21 16:35:47 -05:00
softwarefactory-project-zuul[bot]
c0ff4dad59 Merge pull request #3021 from jakemcdermott/credential_input_access_methods
add input access methods to credentials

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-21 21:10:12 +00:00
softwarefactory-project-zuul[bot]
d98c60519e Merge pull request #2895 from AlanCoding/scm_vars
Allow SCM overwrite vars in the UI

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-21 19:13:47 +00:00
AlanCoding
5dd8c3ace2 Allow SCM overwrite vars in the UI 2019-01-21 13:27:57 -05:00
softwarefactory-project-zuul[bot]
d021c253aa Merge pull request #2959 from crab86/devel
Add Grafana notification type

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-21 17:21:14 +00:00
softwarefactory-project-zuul[bot]
c48c8c04f4 Merge pull request #3038 from Spredzy/x_frame_options
Nginx: Specify X-Frame-Options "DENY" header

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-21 15:31:44 +00:00
softwarefactory-project-zuul[bot]
a2102c92ec Merge pull request #3030 from ryanpetrello/fix-non-utf8-scm
add robust handling of non-UTF8 when detecting inventory/playbooks

Reviewed-by: Ryan Petrello
             https://github.com/ryanpetrello
2019-01-21 13:52:55 +00:00
AlanCoding
99288a5e18 Use custom virtual environment in inventory updates 2019-01-21 06:45:55 -05:00
Yanis Guenane
44c48d1d66 Nginx: Specify X-Frame-Options "DENY" header
Adding the X-Frame-Options "DENY" header to avoid possible clickjacking
attacks.

More info on the why is available here:
https://www.owasp.org/index.php/Testing_for_Clickjacking_(OTG-CLIENT-009)

Signed-off-by: Yanis Guenane <yguenane@redhat.com>
2019-01-21 12:34:17 +01:00
Sebastian
ebe0ded9c2 Add grafana notification type unit tests 2019-01-20 22:42:03 +01:00
Jake McDermott
2dadfbcc14 use credential input access methods in injectors.py 2019-01-20 14:02:01 -05:00
Jake McDermott
3a58a5b772 use credential input access methods in views/__init__.py 2019-01-20 13:08:41 -05:00
Jake McDermott
5010e98b8f use credential input access methods in projects.py 2019-01-20 13:08:38 -05:00
Jake McDermott
3ef4cc9bfa use credential input access methods in serializers.py 2019-01-20 13:08:34 -05:00
Jake McDermott
c01c671642 use credential input access methods in tasks.py 2019-01-20 13:08:30 -05:00
Jake McDermott
a86e270905 add credential input access methods 2019-01-20 13:08:23 -05:00
Sebastian
4058d18593 Add grafana notification type 2019-01-20 13:51:23 +01:00
Ryan Petrello
caa55f112f add robust handling of non-UTF8 when detecting inventory/playbooks 2019-01-19 13:25:41 -05:00
softwarefactory-project-zuul[bot]
d0af952685 Merge pull request #3027 from rooftopcellist/amend_auth_code_help_txt
correct authorization code expiration help-text

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-18 20:37:21 +00:00
softwarefactory-project-zuul[bot]
fbc7f496c5 Merge pull request #3024 from ryanpetrello/ldap-gc-deadlock
fix a deadlock when Python garbage collects LDAPBackend objects

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-18 19:25:11 +00:00
John Mitchell
bb19a4234e Merge pull request #3003 from jlmitch5/newSettingsInUI
add new settings to ui
2019-01-18 14:20:31 -05:00
softwarefactory-project-zuul[bot]
11f7e90f6a Merge pull request #3025 from ryanpetrello/django-cors-headers
add Django CORS middleware

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-18 18:47:14 +00:00
Christian Adams
5c080678a6 correct authorization code expiration help-text 2019-01-18 13:27:01 -05:00
John Mitchell
2df51a923d change grant reference to code in ui help text 2019-01-18 13:13:15 -05:00
Tyler Cross
0da0a8e67b CORS Support
Added the django-cors-headers app and middleware to make CORS possible.
2019-01-18 12:49:00 -05:00
John Mitchell
b75ba7ebea remove auth misc form and move fields under system misc form 2019-01-18 12:43:21 -05:00
John Mitchell
24de951f6c add access token and authorization code expiration settings to ui 2019-01-18 12:43:21 -05:00
John Mitchell
974306541e add isolated settings to ui 2019-01-18 12:43:21 -05:00
softwarefactory-project-zuul[bot]
d2fa5cc182 Merge pull request #3013 from ryanpetrello/ansible-py3-venv
add support for custom py3 ansible virtualenvs

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-18 16:29:27 +00:00
Ryan Petrello
e45e4b3cda fix a deadlock when Python garbage collects LDAPBackend objects
we shouldn't call signal.disconnect in __del__ because it can lead to
deadlocks in Django signal dispatch code

The Signal.connect, Signal.disconnect, and Signal._live_receivers
methods all share a threading.Lock():

22a60f8d0b/django/dispatch/dispatcher.py (L49)

It's possible for this to lead to a deadlock:

1.  Have code that calls Signal._live_receivers and enter the critical
    path inside the shared threading.Lock()
2.  Python garbage collection occurs and finds one or more LDAPBackend
    objects with no more references
3.  This __del__ is called, which calls Signal.disconnect
4.  Code in Signal._disconnect attempts to obtain the (already held)
    threading.Lock
5.  Python hangs forever while attempting to garbage collect
2019-01-18 11:27:50 -05:00
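A minimal, self-contained sketch of the failure mode described in the commit above, using toy stand-ins rather than the Django or AWX classes: a non-reentrant lock shared by the signal's methods, plus a `__del__` that calls `disconnect`, is enough to reproduce the hang. A timeout is used in `disconnect` only so the example terminates and reports the deadlock instead of blocking forever.

```python
import gc
import threading


class Signal:
    """Toy stand-in for django.dispatch.Signal: connect(), disconnect(), and
    receiver iteration all share one non-reentrant threading.Lock."""

    def __init__(self):
        self.lock = threading.Lock()
        self.receivers = []

    def connect(self, receiver):
        with self.lock:
            self.receivers.append(receiver)

    def disconnect(self, receiver):
        # Real code would just do `with self.lock:`; the timeout is only here
        # so this sketch terminates instead of hanging forever.
        if self.lock.acquire(timeout=1):
            try:
                self.receivers.remove(receiver)
            finally:
                self.lock.release()
        else:
            print("deadlock: disconnect() called while the lock is already held")

    def live_receivers(self):
        with self.lock:              # 1. enter the critical path, lock now held
            gc.collect()             # 2./3. garbage collection finds an unreferenced
            return list(self.receivers)  # backend and runs its __del__ right here


populate_user = Signal()


class Backend:
    """Like the old LDAPBackend: registers a receiver and disconnects it in __del__."""

    def __init__(self):
        self._receiver = lambda **kwargs: None
        populate_user.connect(self._receiver)

    def __del__(self):
        # 4. tries to re-acquire the lock already held by live_receivers()
        populate_user.disconnect(self._receiver)


b = Backend()
b.cycle = b    # a reference cycle, so only the cyclic collector can reclaim it
del b
populate_user.live_receivers()   # 5. prints the deadlock message after ~1s
```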
Ryan Petrello
65641c7edd add support for custom py3 ansible virtualenvs 2019-01-18 10:55:53 -05:00
softwarefactory-project-zuul[bot]
5f01c3f5a8 Merge pull request #2994 from coreywan/pod-limits
Add POD Limits

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-18 04:28:11 +00:00
softwarefactory-project-zuul[bot]
7b39198f26 Merge pull request #2995 from coreywan/postgres_helm
adds persistence.storageClass and limits to postgres helm install

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-18 04:24:18 +00:00
softwarefactory-project-zuul[bot]
f583dd73e8 Merge pull request #3008 from AlanCoding/inv_cleanup2
Remove deprecated logic & components from inventory import command

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-17 19:23:02 +00:00
softwarefactory-project-zuul[bot]
57b8aa4892 Merge pull request #3002 from themr0c/pg_password_10_character_limit
pg_password should be random 10 character alphanumeric string, when p…

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-17 18:15:38 +00:00
softwarefactory-project-zuul[bot]
474876872e Merge pull request #2999 from themr0c/issue-2991
related #2991 - Helm creation of postgresql on multiple namespaces

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-17 14:28:05 +00:00
softwarefactory-project-zuul[bot]
3eaed52b83 Merge pull request #3017 from ryanpetrello/beat-shelve-error
close the persistent shelve when we're done checking it

Reviewed-by: Alan Rominger <arominge@redhat.com>
             https://github.com/AlanCoding
2019-01-17 13:46:09 +00:00
AlanCoding
28822d891c remove unneeded steps in inventory import
Delete some cases that directly load scripts due
to the ansible-inventory group_vars problem (now fixed)

Delete the intermediate method that acted as a go-between for the
command and the loader class

Change return type of loader from MemInventory to
a simple python dict

remove backport script and star imports
2019-01-17 08:44:55 -05:00
Ryan Petrello
37dbfa88f9 close the persistent shelve when we're done checking it
this code was added to detect celerybeat shelve .db corruption, but it caused a different issue; opening the shelve in this way puts a write lock on it, which means that when we attempt to open it _again_ moments later, we can't.

when we're done checking the validity of the file, we need to close it
2019-01-17 08:20:21 -05:00
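A short sketch of the shape of that fix (the path and the simplified validity check are placeholders, not the actual AWX code): open the shelve only long enough to verify it, and close it before the scheduler opens the same file again.

```python
import shelve

SCHEDULE_FILE = "/tmp/celerybeat-schedule"   # placeholder path, for illustration only


def schedule_file_is_valid(path):
    """Open the shelve just long enough to confirm it can be read.

    Some dbm backends hold an exclusive lock while the file is open, so an
    unclosed handle here would make the *next* open -- by celerybeat itself,
    moments later -- fail. The with-block guarantees the handle (and the lock)
    is released as soon as the check finishes.
    """
    try:
        with shelve.open(path) as db:
            list(db.keys())          # touch the data to surface corruption
        return True
    except Exception:
        return False


if schedule_file_is_valid(SCHEDULE_FILE):
    beat_db = shelve.open(SCHEDULE_FILE)   # re-opening now succeeds
    beat_db.close()
```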
Fabrice Flore-Thebault
b6c30e8ef5 it's a limitation of the official postgres helm chart
Signed-off-by: Fabrice Flore-Thebault <themr0c@users.noreply.github.com>
2019-01-17 12:56:17 +01:00
Fabrice Flore-Thebault
d938c96a76 pg_password should be a random 10-character alphanumeric string when postgresql is running on kubernetes
Signed-off-by: Fabrice Flore-Thebault <themr0c@users.noreply.github.com>
2019-01-17 12:56:06 +01:00
softwarefactory-project-zuul[bot]
4ce18618cb Merge pull request #3014 from jakemcdermott/skip-ui-release-chromium-download
skip chromium download when using ui-release make target

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-17 04:58:22 +00:00
Jake McDermott
6c7f11395b skip chromium download when building release 2019-01-16 20:48:12 -05:00
softwarefactory-project-zuul[bot]
134950ade1 Merge pull request #3006 from ansible/documentation
Documentation of E2E test fixtures, etc.

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-16 23:43:46 +00:00
Daniel Sami
7258a43bad rewording, typo corrections 2019-01-17 08:28:22 +09:00
Ryan Petrello
27f98163ff Merge pull request #3012 from ryanpetrello/fix-swagger-key-ordering
enforce key order when writing swagger docs JSON
2019-01-16 16:43:43 -05:00
Ryan Petrello
6d04bd34ce enforce key order when writing swagger docs JSON 2019-01-16 16:00:27 -05:00
softwarefactory-project-zuul[bot]
584ec9cf75 Merge pull request #2538 from ryanpetrello/py3
port awx to run natively on python3.6+

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-16 19:39:14 +00:00
Corey Wanless
aebeeb170e adds pod limits
Signed-off-by: Corey Wanless <corey.wanless@wwt.com>
2019-01-16 09:23:18 -06:00
Fabrice Flore-Thebault
c434d38876 adding helm chart version for postgresql
Signed-off-by: Fabrice Flore-Thebault <themr0c@users.noreply.github.com>
2019-01-16 09:40:49 +01:00
Daniel Sami
ae3ab89515 lint fix 2019-01-16 11:06:25 +09:00
Daniel Sami
0c250cd6af Updated parameter info 2019-01-16 10:44:22 +09:00
Ryan Petrello
33c1416f6c work around a py3 bug in celerybeat 2019-01-15 15:19:02 -05:00
Ryan Petrello
3d7fcb3835 fix a few isolated issues related to the py2 -> py3 move 2019-01-15 14:09:05 -05:00
Shane McDonald
04da4503db Python 3 / Upstream Kubernetes 2019-01-15 14:09:05 -05:00
Ryan Petrello
2016798e0f fix a few UTF-8 bugs on Ubuntu related to stdout text downloads 2019-01-15 14:09:05 -05:00
Ryan Petrello
39d119534c support isolated runs in py2 *and* py3 (for now)
once we merge in runner support for isolated environments, we can
revert this commit (because we'll always run isolated code using python3
executables)
2019-01-15 14:09:05 -05:00
Shane McDonald
d273472927 Enable py3 SCL if needed 2019-01-15 14:09:05 -05:00
Shane McDonald
5aa99b2ca1 Dependency updates for Python 3 2019-01-15 14:09:05 -05:00
Ryan Petrello
96b9bd6ab6 make py3 packaging work for k8s 2019-01-15 14:09:05 -05:00
Jim Ladd
2c5bdf3611 fix some isolated py3 bugs 2019-01-15 14:09:05 -05:00
Ryan Petrello
af4234556e remove dm.xmlsec.binding
python-saml uses dm.xmlsec.binding, which only supports python2
by moving to py3, we now use python3-saml (which uses python-xmlsec
instead)

see: https://github.com/onelogin/python-saml/issues/145#issuecomment-222021691
2019-01-15 14:09:05 -05:00
Ryan Petrello
c6482137d1 parametrize PYTHON for Ubuntu py35 support 2019-01-15 14:09:05 -05:00
Ryan Petrello
f223df303f convert py2 -> py3 2019-01-15 14:09:01 -05:00
Ryan Petrello
f132ce9b64 switch image builds to py3 2019-01-15 13:25:13 -05:00
softwarefactory-project-zuul[bot]
f22fd58392 Merge pull request #3007 from AlanCoding/test_logging
Updates to logging, specifically for unit tests

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-15 18:19:58 +00:00
AlanCoding
cccc038600 Updates to logging, specifically for unit tests 2019-01-15 11:34:54 -05:00
softwarefactory-project-zuul[bot]
b9607dd415 Merge pull request #2983 from mabashian/upgrade-jquery-mabashian
Upgrades jQuery and Bootstrap

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-15 14:06:33 +00:00
Fabrice Flore-Thebault
7b32262f75 revert pg_hostname
Signed-off-by: Fabrice Flore-Thebault <themr0c@users.noreply.github.com>
2019-01-15 14:59:17 +01:00
Fabrice Flore-Thebault
d69f6acf64 add helm repo update and fix helm upgrade
Signed-off-by: Fabrice Flore-Thebault <themr0c@users.noreply.github.com>
2019-01-15 14:48:26 +01:00
mabashian
66a859872e Fixes for numerous bootstrap upgrade bugs, uses variables for colors in bootstrap override style file 2019-01-15 08:40:35 -05:00
Fabrice Flore-Thebault
ef3aab1357 related #2991 - unify postgresql_service_name
Signed-off-by: Fabrice Flore-Thebault <themr0c@users.noreply.github.com>
2019-01-15 11:44:08 +01:00
Daniel Sami
62ebf85b96 Documentation of functions 2019-01-14 19:18:47 -05:00
Corey Wanless
0c074e0988 * adds persistence.storageClass and limits to postgres helm install
* adds new variables to the inventory

Signed-off-by: Corey Wanless <corey.wanless@wwt.com>
2019-01-14 11:28:21 -06:00
softwarefactory-project-zuul[bot]
32c705a62a Merge pull request #2996 from coreywan/setup-postgress-activation-wait
adds wait time for postgres setup as a variable

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-14 17:22:54 +00:00
mabashian
7194338653 Reduces flake on launch job e2e test 2019-01-14 10:54:37 -05:00
Fabrice Flore-Thebault
d43521bb77 fix #2991 - make Helm creation of postgresql succeed when installing multiple AWX on different namespaces on the same kubernetes
Signed-off-by: Fabrice Flore-Thebault <themr0c@users.noreply.github.com>
2019-01-14 10:32:21 +01:00
Corey Wanless
b1710f9523 adds wait time for postgres setup as a variable 2019-01-11 22:23:43 -06:00
mabashian
3b456d3e72 Fix credential list e2e test 2019-01-11 16:44:59 -05:00
softwarefactory-project-zuul[bot]
12a04a6da6 Merge pull request #2956 from ansible/swap-diff-order-schema-change
swap file order so diff of schema makes more sense

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-11 20:11:42 +00:00
mabashian
99205fde16 Fixes linting errors 2019-01-11 13:06:52 -05:00
mabashian
8539eae114 Fixes for e2e tests 2019-01-11 12:50:01 -05:00
softwarefactory-project-zuul[bot]
3e2dd4f86b Merge pull request #2993 from ryanpetrello/fix-ctint-db-failure
catch _all_ types of django.db.utils.Error on CTinT key lookups

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-11 14:56:11 +00:00
Ryan Petrello
32c14d6eab catch _all_ types of django.db.utils.Error on CTinT key lookups 2019-01-11 08:49:47 -05:00
softwarefactory-project-zuul[bot]
4f9901db38 Merge pull request #2867 from AlanCoding/supervisor_names
Make docker environment interoperable with supervisorctl commands

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-11 13:28:35 +00:00
softwarefactory-project-zuul[bot]
a77c981e0c Merge pull request #2992 from AlanCoding/opt_dashboard
Optimize dashboard using Django annotation for sum

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-01-11 13:15:42 +00:00
AlanCoding
77d2364022 Make docker environment interoperable with supervisorctl commands 2019-01-10 13:41:15 -05:00
AlanCoding
d1b42fd583 Optimize dashboard using Django annotation for sum 2019-01-10 12:22:39 -05:00
mabashian
2dfb0abb69 Fixes jshint errors 2019-01-09 11:45:51 -05:00
mabashian
7bcbaabd71 Removed extraneous comments 2019-01-09 11:03:51 -05:00
mabashian
9c20e1b494 Upgrades jquery and bootstrap 2019-01-09 11:03:51 -05:00
softwarefactory-project-zuul[bot]
2b5210842d Merge pull request #2977 from MarBra/devel
Fix typo in ca_trust_dir

Reviewed-by: Bill Nottingham
             https://github.com/wenottingham
2019-01-07 19:42:50 +00:00
marcel
0b3e51458d Fix typo in ca_trust_dir
The correct path is used in docker-compose template:
- "{{ ca_trust_dir +':/etc/pki/ca-trust/source/anchors:ro' }}"
2019-01-07 19:29:34 +01:00
Chris Meyers
d57fc998d5 Merge pull request #2972 from wwitzel3/devel
update to the latest asgi-amqp
2019-01-03 08:22:05 -05:00
Wayne Witzel III
1079051b12 update to the latest asgi-amqp 2019-01-03 07:52:16 -05:00
Chris Meyers
4641056829 Merge pull request #2942 from chrismeyersfsu/doc-tm_affinity
add docs for task manager node decider
2019-01-02 13:57:58 -05:00
chris meyers
db2bb19d65 add docs for task manager node decider 2019-01-02 12:17:28 -05:00
Elijah DeLee
e5ad2e44fb swap file order so diff of schema makes more sense
This way we will get +'s next to new content :)
2018-12-20 15:05:01 -05:00
softwarefactory-project-zuul[bot]
aa8cda0001 Merge pull request #2801 from ryanpetrello/more-robust-isolated-capacity
collect isolated capacity using a cache plugin, not stdout parsing

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-20 20:01:00 +00:00
softwarefactory-project-zuul[bot]
949f383564 Merge pull request #2947 from wenottingham/spelling-is-gud
Fix 'credential' typo.

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-19 13:54:47 +00:00
Bill Nottingham
479ad13630 Fix some more typos while here. 2018-12-18 16:23:17 -05:00
Bill Nottingham
23c2e1be31 Fix 'credential' typo. 2018-12-18 16:12:10 -05:00
softwarefactory-project-zuul[bot]
7628ef01f1 Merge pull request #2938 from mabashian/wf-details-click
Updates to workflow node details link

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-18 12:06:33 +00:00
mabashian
c0730aa562 Prevent mousedown on details link from triggering pan functionality 2018-12-17 15:20:34 -05:00
mabashian
67d6a9f9ea Fixes display of wf node details link in FF by adding height and width 2018-12-17 14:33:48 -05:00
mabashian
f9854abfa1 Fixed linting error 2018-12-17 10:40:20 -05:00
mabashian
2697615dbf Undo GET request that was made for workflow node jobs missing a type and instead leverage the redirect route. The workflow node results redirect now works for all job types. 2018-12-17 10:26:01 -05:00
softwarefactory-project-zuul[bot]
a131250dc1 Merge pull request #2917 from AlanCoding/relaunch_sjt
Fix bug where some SJs could not be relaunched

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-17 15:07:32 +00:00
Michael Abashian
2d237b6dbb Merge pull request #2 from jakemcdermott/wf-details-click
add output redirect route for workflow playbook nodes
2018-12-17 09:41:41 -05:00
Jake McDermott
c8b15005b4 link to workflow playbook node route from workflow viewer 2018-12-14 19:22:23 -05:00
Jake McDermott
a5c4350695 add redirect route for workflow viewer 2018-12-14 19:20:01 -05:00
mabashian
9f18f8dbdb Fixes split job inside workflow details link bug 2018-12-14 18:04:39 -05:00
mabashian
a8e1c8960f Remove details function 2018-12-14 16:27:35 -05:00
mabashian
7f66053654 Changed workflow node details link to href so that it can be opened in a new tab 2018-12-14 14:59:19 -05:00
softwarefactory-project-zuul[bot]
1d6c88b7e2 Merge pull request #2933 from ryanpetrello/openshift-ha-policy
configure an HA policy for openshift/k8s installs

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-14 19:45:26 +00:00
softwarefactory-project-zuul[bot]
049f85f3c9 Merge pull request #2937 from mabashian/2932-schedule-prompts
Fixes bug scheduling jt where first survey question is optional

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-14 19:37:03 +00:00
Ryan Petrello
4858868428 configure an HA policy for openshift/k8s installs 2018-12-14 14:08:30 -05:00
mabashian
4e37076955 Fixes bug scheduling jt where first survey question is optional 2018-12-14 14:08:06 -05:00
softwarefactory-project-zuul[bot]
e6f654b568 Merge pull request #2924 from AlanCoding/run_as_root
Catch python error when unable to find user

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-13 18:01:55 +00:00
AlanCoding
65e110cdbf catch python error when unable to find user 2018-12-13 11:47:54 -05:00
softwarefactory-project-zuul[bot]
10e99c76a8 Merge pull request #2921 from GitStorm/patch-1
tower-cli config: host value needs to be URL

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-13 13:13:42 +00:00
GitStorm
b0e3bc96dd tower-cli config: host value needs to be URL
Since the key name "host" is slightly misleading, it would help to point out in this documentation that a URL is in fact required for the "host" key/value pair in the tower-cli config. Failing to do so produces the following error:

Error: There was a network error of some kind trying to connect to Tower.
The most common  reason for this is a settings issue; is your "host" value in `tower-cli config` correct?
2018-12-13 11:52:56 +01:00
softwarefactory-project-zuul[bot]
59df54b363 Merge pull request #2919 from ryanpetrello/dispatcher-task-import-hardening
only allow the task dispatch worker to import and run decorated tasks

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-13 01:12:48 +00:00
Ryan Petrello
5950f26c69 only allow the task dispatch worker to import and run decorated tasks
this _technically_ prevents a remote code exploit where a user who has
access to publish AMQP messages to the dispatch queue could craft
a special message that would import and run arbitrary Python functions;
that said, the types of user with this privilege level are generally
_already_ the awx user (so they can already do this by hand if they
want)
2018-12-12 17:46:41 -05:00
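A rough illustration of the hardening idea described above (a toy registry, not AWX's dispatcher code): the worker keeps an explicit allow-list of callables marked with a task decorator and refuses to import or run anything else named in a queue message.

```python
import importlib

TASK_REGISTRY = set()   # dotted paths of callables explicitly marked as tasks


def task(fn):
    """Mark a callable as dispatchable; only registered names may be run."""
    TASK_REGISTRY.add(f"{fn.__module__}.{fn.__qualname__}")
    return fn


@task
def cleanup_jobs(days=30):
    print(f"reaping jobs older than {days} days")


def run_task(message):
    """Run the task named in a queue message, but only if it is registered.

    An arbitrary dotted path smuggled into the message (say, 'os.system')
    is rejected instead of being imported and called.
    """
    name = message["task"]
    if name not in TASK_REGISTRY:
        raise ValueError(f"refusing to run unregistered callable {name!r}")
    module_name, _, func_name = name.rpartition(".")
    fn = getattr(importlib.import_module(module_name), func_name)
    return fn(*message.get("args", []), **message.get("kwargs", {}))


run_task({"task": f"{__name__}.cleanup_jobs", "kwargs": {"days": 7}})  # allowed
# run_task({"task": "os.system", "args": ["echo owned"]})              # ValueError
```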
AlanCoding
a3a5c6bf9f fix bug where some SJs could not be relaunched 2018-12-12 11:56:57 -05:00
softwarefactory-project-zuul[bot]
ca16787e7c Merge pull request #2697 from ryanpetrello/smarter-websocket-expiry
stop various async background requests from bumping the session expiry

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-12 15:39:20 +00:00
softwarefactory-project-zuul[bot]
271bd10b47 Merge pull request #2907 from shanemcd/devel
Bump version to 2.1.2

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-11 17:59:17 +00:00
Shane McDonald
e3872ebd58 Bump version to 2.1.2 2018-12-11 12:30:10 -05:00
softwarefactory-project-zuul[bot]
eee716644b Merge pull request #2875 from MrMEEE/patch-1
Bumped Version to 2.1.1

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-11 17:25:26 +00:00
Ryan Petrello
c2660af60d stop various async background requests from bumping the session expiry
if a user has an active session that just sits on the dashboard or job
list, websocket messages that come in (e.g., job status changes)
will trigger AJAX requests for more data; this process causes a user
with an idle login to continue to generate API requests, which in turn
ticks their expiry timer.  As a result, users with active sessions
sitting on these two (popular) pages will never be automatically logged
out via SESSION_MAX_AGE.

this change introduces a special header that the UI can use to signify
that a request shouldn't bump the expiry timer
2018-12-11 09:15:58 -05:00
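In Django middleware terms the mechanism might look roughly like the sketch below; the header name, the timeout value, and the middleware class are hypothetical, not the identifiers AWX actually ships (only SESSION_MAX_AGE is named in the commit).

```python
# Hypothetical header and middleware -- a sketch of the idea, not AWX's code.
KEEPALIVE_EXEMPT_HEADER = "HTTP_X_DO_NOT_BUMP_SESSION"
SESSION_MAX_AGE = 1800  # placeholder value, in seconds


class SessionExpiryMiddleware:
    """Refresh the rolling session expiry on real user activity only.

    Background polling triggered by websocket events sends the marker header,
    so a session idling on the dashboard or job list no longer keeps itself
    alive forever.
    """

    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        if request.META.get(KEEPALIVE_EXEMPT_HEADER) != "true":
            # Only genuine user-driven requests tick the expiry timer.
            request.session.set_expiry(SESSION_MAX_AGE)
        return response
```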
softwarefactory-project-zuul[bot]
2758a38485 Merge pull request #2898 from mabashian/2851-perms
Fixes bug where admin/member roles weren't showing up when adding user to org

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-10 16:51:03 +00:00
softwarefactory-project-zuul[bot]
9104f485e6 Merge pull request #2856 from wenottingham/one-small-step-foreman
Update foreman.py from Ansible devel, primarily for unicode fixes.

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-10 16:27:52 +00:00
mabashian
ae7361f82d Fixes bug where admin/member roles weren't showing up when adding user to org 2018-12-10 11:12:09 -05:00
softwarefactory-project-zuul[bot]
42562e86e4 Merge pull request #2888 from mabashian/sanitize-app-token-list
Sanitize username and description in application tokens list

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-07 21:42:38 +00:00
softwarefactory-project-zuul[bot]
982ed37b06 Merge pull request #2890 from mabashian/3198-survey
Fixes bug launching jt where first survey question is optional and empty

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-07 20:40:07 +00:00
mabashian
c0c666cc87 Fixes bug launching jt where first survey question is optional and empty 2018-12-07 15:12:22 -05:00
softwarefactory-project-zuul[bot]
8a284889f5 Merge pull request #2889 from mabashian/2887-workflow-cred
Properly POST credentials to workflow nodes

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-07 20:08:36 +00:00
mabashian
b891e2c204 Properly POST credentials to workflow nodes 2018-12-07 14:37:56 -05:00
mabashian
a8bf7366cf Sanitize username and description in application tokens list 2018-12-07 14:24:12 -05:00
softwarefactory-project-zuul[bot]
e517f81b8f Merge pull request #2880 from AlanCoding/fix_v1_links
Fix links to some resources that lack v1 pages

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-07 14:33:15 +00:00
softwarefactory-project-zuul[bot]
c4c99332fc Merge pull request #2873 from ansible/related_slices
Show type in related_jobs, link based on type

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-06 20:51:00 +00:00
AlanCoding
40b5ce4b2e link v1 pages to v2 credential type page 2018-12-06 15:41:26 -05:00
AlanCoding
d2cd337c1f fix links to some resources that lack v1 pages 2018-12-06 08:29:23 -05:00
softwarefactory-project-zuul[bot]
b9913fb4f9 Merge pull request #2877 from chrismeyersfsu/improvement-better_default_loggers
more sane default log handlers

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-05 15:59:12 +00:00
chris meyers
d1705dd0cc more sane default log handlers
* Removed the emailing of admins on request error. When turned on, the
handler will include all django settings in the email. This is not
desirable from a security standpoint.
2018-12-05 09:38:27 -05:00
Martin Juhl
816cc29132 Bumped Version to 2.1.1 2018-12-05 00:33:04 +01:00
AlanCoding
f09b8efa87 tests and optimizations for UJT list with non-joblet recent_jobs 2018-12-04 16:16:05 -05:00
softwarefactory-project-zuul[bot]
6ebc6809eb Merge pull request #2869 from AlanCoding/syncs_do_not_update_project
Do not update project details after sync jobs

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-04 20:10:01 +00:00
AlanCoding
4b31367945 Do not update project details after sync jobs 2018-12-04 13:00:23 -05:00
kialam
2a62e300a2 UI update to check recent job type for routing to detail pages. 2018-12-04 11:19:25 -05:00
softwarefactory-project-zuul[bot]
f6b075843e Merge pull request #2845 from marshmalien/fix-org-ig-modal
Fix instance group modal selection

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-04 15:42:49 +00:00
Marliana Lara
4723773354 Fix instance group modal selection 2018-12-04 10:15:38 -05:00
softwarefactory-project-zuul[bot]
70be95cec5 Merge pull request #2861 from wenottingham/tighten-up-the-slack
Fix tooltip for slack channel list to note '#' is required.

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-04 04:23:20 +00:00
softwarefactory-project-zuul[bot]
201b17012d Merge pull request #2865 from ansible/output-search-docslink
point output search doc link to latest

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-04 04:17:57 +00:00
Jake McDermott
47264b0809 point output search doc link to latest 2018-12-03 17:09:00 -05:00
Bill Nottingham
c51f235fab Fix tooltip for slack channel list to note '#' is required. 2018-12-03 14:22:59 -05:00
Bill Nottingham
f1b1224a27 Update foreman.py from Ansible devel, primarily for unicode fixes. 2018-12-03 10:23:28 -05:00
softwarefactory-project-zuul[bot]
63b0796738 Merge pull request #2852 from jlmitch5/updateOrgCardsCountWhenDatasetChanges
update org cards count when dataset changes

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-02 20:05:19 +00:00
softwarefactory-project-zuul[bot]
5961e3ef2e Merge pull request #2846 from kialam/fix-3016-missing-job-events
Fix 3016 missing job events

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-12-01 05:12:05 +00:00
softwarefactory-project-zuul[bot]
246d80f177 Merge pull request #2850 from jlmitch5/addUserTokenPagination
add pagination to user tokens list

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-11-30 21:29:10 +00:00
softwarefactory-project-zuul[bot]
8005b47c14 Merge pull request #2849 from ryanpetrello/fix-custom-cred-encryption-nit
allow encrypted fields in custom credentials to be empty

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2018-11-30 21:07:30 +00:00
John Mitchell
1317572979 update org cards count when dataset changes 2018-11-30 15:54:15 -05:00
AlanCoding
b763c51f8a add type to recent_jobs 2018-11-30 15:16:09 -05:00
softwarefactory-project-zuul[bot]
e70055a333 Merge pull request #2848 from wenottingham/more-fields-for-the-fields-god
Add timeout & slice count to the job field whitelist.

Reviewed-by: Bill Nottingham
             https://github.com/wenottingham
2018-11-30 20:02:38 +00:00
John Mitchell
52f86a206a add pagination to user tokens list 2018-11-30 14:27:23 -05:00
Ryan Petrello
7252883094 allow encrypted fields in custom credentials to be empty 2018-11-30 14:07:56 -05:00
kialam
afa7c2d69f Update unit tests. 2018-11-30 13:58:13 -05:00
Bill Nottingham
9c44d1f526 Add timeout & slice count to the job field whitelist. 2018-11-30 13:43:21 -05:00
kialam
473ce95c86 Fix failing unit tests. 2018-11-30 12:16:28 -05:00
kialam
3e1e068013 Add boundary checks for getReadyCount method. 2018-11-30 12:03:26 -05:00
kialam
746a154f2b Address missing job events.
- Fix off by one error.
- Add unit tests for Stream Service.
2018-11-30 11:23:15 -05:00
Ryan Petrello
d5c6c589b2 add an AWX_ISOLATED_VERBOSITY setting for debugging isolated connections 2018-11-26 23:47:16 -05:00
Ryan Petrello
fc0a039097 collect isolated capacity using a cache plugin, not stdout parsing
reading capacity values using the jsonfile cache plugin is more robust
in scenarios where ansible-playbook may print non-JSON output (such as
-vvv or when a custom callback plugin like timer is enabled)
2018-11-26 17:08:42 -05:00
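A hedged sketch of the cache-plugin approach from the commit above (the directory, file naming, and fact keys are placeholders, not the identifiers AWX uses): instead of parsing ansible-playbook stdout, read the per-host JSON files that the `jsonfile` fact cache plugin writes.

```python
import json
from pathlib import Path

# Placeholder location; in practice this is whatever directory the playbook's
# ANSIBLE_CACHE_PLUGIN_CONNECTION setting points the jsonfile plugin at.
FACT_CACHE_DIR = Path("/tmp/awx_isolated_facts")


def read_isolated_capacity(hostname):
    """Read capacity facts for one isolated node from the jsonfile cache.

    The jsonfile plugin writes one JSON document per host (naming simplified
    here; the real plugin may add a configurable prefix), so the values stay
    intact even when ansible-playbook's stdout is polluted by -vvv output or
    a chatty callback plugin -- the case that broke stdout parsing.
    """
    data = json.loads((FACT_CACHE_DIR / hostname).read_text())
    # Placeholder fact names, set by a hypothetical capacity-check playbook.
    return {
        "cpu_capacity": data.get("awx_capacity_cpu"),
        "mem_capacity": data.get("awx_capacity_mem"),
        "version": data.get("awx_capacity_version"),
    }


# print(read_isolated_capacity("isolated-node-1"))  # once the capacity playbook has run
```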
475 changed files with 4570 additions and 4612 deletions

.gitignore
View File

@@ -62,6 +62,7 @@ __pycache__
# UI build flag files
awx/ui/.deps_built
awx/ui/.release_built
awx/ui/.release_deps_built
# Testing
.cache

View File

@@ -18,7 +18,7 @@ $ pip install --upgrade ansible-tower-cli
The AWX host URL, user, and password must be set for the AWX instance to be exported:
```
$ tower-cli config host <old-awx-host.example.com>
$ tower-cli config host http://<old-awx-host.example.com>
$ tower-cli config username <user>
$ tower-cli config password <pass>
```
@@ -62,7 +62,7 @@ For other install methods, refer to the [Install.md](https://github.com/ansible/
Configure tower-cli for your new AWX host as shown earlier. Import from a JSON file named assets.json
```
$ tower-cli config host <new-awx-host.example.com>
$ tower-cli config host http://<new-awx-host.example.com>
$ tower-cli config username <user>
$ tower-cli config password <pass>
$ tower-cli send assets.json

View File

@@ -1,4 +1,4 @@
PYTHON ?= python
PYTHON ?= python3
PYTHON_VERSION = $(shell $(PYTHON) -c "from distutils.sysconfig import get_python_version; print(get_python_version())")
SITELIB=$(shell $(PYTHON) -c "from distutils.sysconfig import get_python_lib; print(get_python_lib())")
OFFICIAL ?= no
@@ -53,6 +53,7 @@ WHEEL_FILE ?= $(WHEEL_NAME)-py2-none-any.whl
# UI flag files
UI_DEPS_FLAG_FILE = awx/ui/.deps_built
UI_RELEASE_DEPS_FLAG_FILE = awx/ui/.release_deps_built
UI_RELEASE_FLAG_FILE = awx/ui/.release_built
I18N_FLAG_FILE = .i18n_built
@@ -73,6 +74,7 @@ clean-ui:
rm -rf awx/ui/test/e2e/reports/
rm -rf awx/ui/client/languages/
rm -f $(UI_DEPS_FLAG_FILE)
rm -f $(UI_RELEASE_DEPS_FLAG_FILE)
rm -f $(UI_RELEASE_FLAG_FILE)
clean-tmp:
@@ -120,23 +122,30 @@ virtualenv_ansible:
mkdir $(VENV_BASE); \
fi; \
if [ ! -d "$(VENV_BASE)/ansible" ]; then \
virtualenv --system-site-packages $(VENV_BASE)/ansible && \
virtualenv -p python --system-site-packages $(VENV_BASE)/ansible && \
$(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) --ignore-installed six packaging appdirs && \
$(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) --ignore-installed setuptools==36.0.1 && \
$(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) --ignore-installed pip==9.0.1; \
fi; \
fi
virtualenv_ansible_py3:
if [ "$(VENV_BASE)" ]; then \
if [ ! -d "$(VENV_BASE)" ]; then \
mkdir $(VENV_BASE); \
fi; \
if [ ! -d "$(VENV_BASE)/ansible3" ]; then \
python3 -m venv --system-site-packages $(VENV_BASE)/ansible3; \
fi; \
fi
virtualenv_awx:
if [ "$(VENV_BASE)" ]; then \
if [ ! -d "$(VENV_BASE)" ]; then \
mkdir $(VENV_BASE); \
fi; \
if [ ! -d "$(VENV_BASE)/awx" ]; then \
virtualenv --system-site-packages $(VENV_BASE)/awx && \
$(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) --ignore-installed six packaging appdirs && \
$(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) --ignore-installed setuptools==36.0.1 && \
$(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) --ignore-installed pip==9.0.1; \
$(PYTHON) -m venv $(VENV_BASE)/awx; \
fi; \
fi
@@ -148,6 +157,11 @@ requirements_ansible: virtualenv_ansible
fi
$(VENV_BASE)/ansible/bin/pip uninstall --yes -r requirements/requirements_ansible_uninstall.txt
requirements_ansible_py3: virtualenv_ansible_py3
cat requirements/requirements_ansible.txt requirements/requirements_ansible_git.txt | $(VENV_BASE)/ansible3/bin/pip3 install $(PIP_OPTIONS) --no-binary $(SRC_ONLY_PKGS) --ignore-installed -r /dev/stdin
$(VENV_BASE)/ansible3/bin/pip3 install ansible # can't inherit from system ansible, it's py2
$(VENV_BASE)/ansible3/bin/pip3 uninstall --yes -r requirements/requirements_ansible_uninstall.txt
requirements_ansible_dev:
if [ "$(VENV_BASE)" ]; then \
$(VENV_BASE)/ansible/bin/pip install pytest mock; \
@@ -155,11 +169,9 @@ requirements_ansible_dev:
requirements_isolated:
if [ ! -d "$(VENV_BASE)/awx" ]; then \
virtualenv --system-site-packages $(VENV_BASE)/awx && \
$(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) --ignore-installed six packaging appdirs && \
$(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) --ignore-installed setuptools==35.0.2 && \
$(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) --ignore-installed pip==9.0.1; \
$(PYTHON) -m venv $(VENV_BASE)/awx; \
fi;
echo "include-system-site-packages = true" >> $(VENV_BASE)/awx/lib/python$(PYTHON_VERSION)/pyvenv.cfg
$(VENV_BASE)/awx/bin/pip install -r requirements/requirements_isolated.txt
# Install third-party requirements needed for AWX's environment.
@@ -169,6 +181,7 @@ requirements_awx: virtualenv_awx
else \
cat requirements/requirements.txt requirements/requirements_git.txt | $(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) --no-binary $(SRC_ONLY_PKGS) --ignore-installed -r /dev/stdin ; \
fi
echo "include-system-site-packages = true" >> $(VENV_BASE)/awx/lib/python$(PYTHON_VERSION)/pyvenv.cfg
#$(VENV_BASE)/awx/bin/pip uninstall --yes -r requirements/requirements_tower_uninstall.txt
requirements_awx_dev:
@@ -176,7 +189,7 @@ requirements_awx_dev:
requirements: requirements_ansible requirements_awx
requirements_dev: requirements requirements_awx_dev requirements_ansible_dev
requirements_dev: requirements requirements_ansible_py3 requirements_awx_dev requirements_ansible_dev
requirements_test: requirements
@@ -195,7 +208,7 @@ version_file:
if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
python -c "import awx as awx; print awx.__version__" > /var/lib/awx/.awx_version; \
python -c "import awx; print(awx.__version__)" > /var/lib/awx/.awx_version; \
# Do any one-time init tasks.
comma := ,
@@ -259,7 +272,7 @@ supervisor:
@if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
supervisord --configuration /supervisor.conf --pidfile=/tmp/supervisor_pid
supervisord --pidfile=/tmp/supervisor_pid
# Alternate approach to tmux to run all development tasks specified in
# Procfile.
@@ -356,7 +369,7 @@ check: flake8 pep8 # pyflakes pylint
awx-link:
cp -R /tmp/awx.egg-info /awx_devel/ || true
sed -i "s/placeholder/$(shell git describe --long | sed 's/\./\\./g')/" /awx_devel/awx.egg-info/PKG-INFO
cp -f /tmp/awx.egg-link /venv/awx/lib/python2.7/site-packages/awx.egg-link
cp -f /tmp/awx.egg-link /venv/awx/lib/python$(PYTHON_VERSION)/site-packages/awx.egg-link
TEST_DIRS ?= awx/main/tests/unit awx/main/tests/functional awx/conf/tests awx/sso/tests
@@ -453,7 +466,7 @@ messages:
# generate l10n .json .mo
languages: $(I18N_FLAG_FILE)
$(I18N_FLAG_FILE): $(UI_DEPS_FLAG_FILE)
$(I18N_FLAG_FILE): $(UI_RELEASE_DEPS_FLAG_FILE)
$(NPM_BIN) --prefix awx/ui run languages
$(PYTHON) tools/scripts/compilemessages.py
touch $(I18N_FLAG_FILE)
@@ -461,12 +474,30 @@ $(I18N_FLAG_FILE): $(UI_DEPS_FLAG_FILE)
# End l10n TASKS
# --------------------------------------
# UI TASKS
# UI RELEASE TASKS
# --------------------------------------
ui-release: $(UI_RELEASE_FLAG_FILE)
$(UI_RELEASE_FLAG_FILE): $(I18N_FLAG_FILE) $(UI_RELEASE_DEPS_FLAG_FILE)
$(NPM_BIN) --prefix awx/ui run build-release
touch $(UI_RELEASE_FLAG_FILE)
$(UI_RELEASE_DEPS_FLAG_FILE):
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=1 $(NPM_BIN) --unsafe-perm --prefix awx/ui install --no-save awx/ui
touch $(UI_RELEASE_DEPS_FLAG_FILE)
# END UI RELEASE TASKS
# --------------------------------------
# UI TASKS
# --------------------------------------
ui-deps: $(UI_DEPS_FLAG_FILE)
$(UI_DEPS_FLAG_FILE):
@if [ -f ${UI_RELEASE_DEPS_FLAG_FILE} ]; then \
rm -rf awx/ui/node_modules; \
rm -f ${UI_RELEASE_DEPS_FLAG_FILE}; \
fi; \
$(NPM_BIN) --unsafe-perm --prefix awx/ui install --no-save awx/ui
touch $(UI_DEPS_FLAG_FILE)
@@ -481,12 +512,6 @@ ui-docker: $(UI_DEPS_FLAG_FILE)
ui-devel: $(UI_DEPS_FLAG_FILE)
$(NPM_BIN) --prefix awx/ui run build-devel -- $(MAKEFLAGS)
ui-release: $(UI_RELEASE_FLAG_FILE)
$(UI_RELEASE_FLAG_FILE): $(I18N_FLAG_FILE) $(UI_DEPS_FLAG_FILE)
$(NPM_BIN) --prefix awx/ui run build-release
touch $(UI_RELEASE_FLAG_FILE)
ui-test: $(UI_DEPS_FLAG_FILE)
$(NPM_BIN) --prefix awx/ui run test
@@ -497,9 +522,6 @@ ui-test-ci: $(UI_DEPS_FLAG_FILE)
$(NPM_BIN) --prefix awx/ui run test:ci
$(NPM_BIN) --prefix awx/ui run unit
testjs_ci:
echo "Update UI unittests later" #ui-test-ci
jshint: $(UI_DEPS_FLAG_FILE)
$(NPM_BIN) run --prefix awx/ui jshint
$(NPM_BIN) run --prefix awx/ui lint
@@ -547,7 +569,7 @@ docker-isolated:
TAG=$(COMPOSE_TAG) DEV_DOCKER_TAG_BASE=$(DEV_DOCKER_TAG_BASE) docker-compose -f tools/docker-compose.yml -f tools/docker-isolated-override.yml create
docker start tools_awx_1
docker start tools_isolated_1
echo "__version__ = '`git describe --long | cut -d - -f 1-1`'" | docker exec -i tools_isolated_1 /bin/bash -c "cat > /venv/awx/lib/python2.7/site-packages/awx.py"
echo "__version__ = '`git describe --long | cut -d - -f 1-1`'" | docker exec -i tools_isolated_1 /bin/bash -c "cat > /venv/awx/lib/python$(PYTHON_VERSION)/site-packages/awx.py"
CURRENT_UID=$(shell id -u) TAG=$(COMPOSE_TAG) DEV_DOCKER_TAG_BASE=$(DEV_DOCKER_TAG_BASE) docker-compose -f tools/docker-compose.yml -f tools/docker-isolated-override.yml up
# Docker Compose Development environment
@@ -574,7 +596,7 @@ docker-compose-detect-schema-change:
$(MAKE) docker-compose-genschema
curl https://s3.amazonaws.com/awx-public-ci-files/schema.json -o reference-schema.json
# Ignore differences in whitespace with -b
diff -u -b schema.json reference-schema.json
diff -u -b reference-schema.json schema.json
docker-compose-clean:
cd tools && CURRENT_UID=$(shell id -u) TAG=$(COMPOSE_TAG) DEV_DOCKER_TAG_BASE=$(DEV_DOCKER_TAG_BASE) docker-compose run --rm -w /awx_devel --service-ports awx make clean

View File

@@ -1 +1 @@
2.1.0
3.0.0

View File

@@ -43,7 +43,7 @@ register(
help_text=_('Dictionary for customizing OAuth 2 timeouts, available items are '
'`ACCESS_TOKEN_EXPIRE_SECONDS`, the duration of access tokens in the number '
'of seconds, and `AUTHORIZATION_CODE_EXPIRE_SECONDS`, the duration of '
'authorization grants in the number of seconds.'),
'authorization codes in the number of seconds.'),
category=_('Authentication'),
category_slug='authentication',
)

View File

@@ -65,7 +65,7 @@ class TypeFilterBackend(BaseFilterBackend):
model = queryset.model
model_type = get_type_for_model(model)
if 'polymorphic_ctype' in get_all_field_names(model):
types_pks = set([v for k,v in types_map.items() if k in types])
types_pks = set([v for k, v in types_map.items() if k in types])
queryset = queryset.filter(polymorphic_ctype_id__in=types_pks)
elif model_type in types:
queryset = queryset
@@ -192,7 +192,7 @@ class FieldLookupBackend(BaseFilterBackend):
def value_to_python(self, model, lookup, value):
try:
lookup = lookup.encode("ascii")
lookup.encode("ascii")
except UnicodeEncodeError:
raise ValueError("%r is not an allowed field name. Must be ascii encodable." % lookup)
@@ -363,12 +363,12 @@ class FieldLookupBackend(BaseFilterBackend):
args.append(q)
if search_filters and search_filter_relation == 'OR':
q = Q()
for term, constrains in search_filters.iteritems():
for term, constrains in search_filters.items():
for constrain in constrains:
q |= Q(**{constrain: term})
args.append(q)
elif search_filters and search_filter_relation == 'AND':
for term, constrains in search_filters.iteritems():
for term, constrains in search_filters.items():
q_chain = Q()
for constrain in constrains:
q_chain |= Q(**{constrain: term})

View File

@@ -6,7 +6,7 @@ import inspect
import logging
import time
import six
import urllib
import urllib.parse
# Django
from django.conf import settings
@@ -90,9 +90,10 @@ class LoggedLoginView(auth_views.LoginView):
logger.info(smart_text(u"User {} logged in.".format(self.request.user.username)))
ret.set_cookie('userLoggedIn', 'true')
current_user = UserSerializer(self.request.user)
current_user = JSONRenderer().render(current_user.data)
current_user = urllib.quote('%s' % current_user, '')
current_user = smart_text(JSONRenderer().render(current_user.data))
current_user = urllib.parse.quote('%s' % current_user, '')
ret.set_cookie('current_user', current_user, secure=settings.SESSION_COOKIE_SECURE or None)
return ret
else:
ret.status_code = 401
@@ -304,7 +305,7 @@ class APIView(views.APIView):
# submitted data was rejected.
request_method = getattr(self, '_raw_data_request_method', None)
response_status = getattr(self, '_raw_data_response_status', 0)
if request_method in ('POST', 'PUT', 'PATCH') and response_status in xrange(400, 500):
if request_method in ('POST', 'PUT', 'PATCH') and response_status in range(400, 500):
return self.request.data.copy()
return data
@@ -347,7 +348,7 @@ class GenericAPIView(generics.GenericAPIView, APIView):
# form.
if hasattr(self, '_raw_data_form_marker'):
# Always remove read only fields from serializer.
for name, field in serializer.fields.items():
for name, field in list(serializer.fields.items()):
if getattr(field, 'read_only', None):
del serializer.fields[name]
serializer._data = self.update_raw_data(serializer.data)
@@ -747,7 +748,7 @@ class SubListAttachDetachAPIView(SubListCreateAttachDetachAPIView):
def update_raw_data(self, data):
request_method = getattr(self, '_raw_data_request_method', None)
response_status = getattr(self, '_raw_data_response_status', 0)
if request_method == 'POST' and response_status in xrange(400, 500):
if request_method == 'POST' and response_status in range(400, 500):
return super(SubListAttachDetachAPIView, self).update_raw_data(data)
return {'id': None}

View File

@@ -157,7 +157,7 @@ class Metadata(metadata.SimpleMetadata):
finally:
view.request = request
for field, meta in actions[method].items():
for field, meta in list(actions[method].items()):
if not isinstance(meta, dict):
continue

View File

@@ -5,6 +5,7 @@ import json
# Django
from django.conf import settings
from django.utils import six
from django.utils.encoding import smart_str
from django.utils.translation import ugettext_lazy as _
# Django REST Framework
@@ -25,7 +26,7 @@ class JSONParser(parsers.JSONParser):
encoding = parser_context.get('encoding', settings.DEFAULT_CHARSET)
try:
data = stream.read().decode(encoding)
data = smart_str(stream.read(), encoding=encoding)
if not data:
return {}
obj = json.loads(data, object_pairs_hook=OrderedDict)

View File

@@ -8,7 +8,7 @@ import logging
import operator
import re
import six
import urllib
import urllib.parse
from collections import OrderedDict
from datetime import timedelta
@@ -40,6 +40,7 @@ from rest_framework.utils.serializer_helpers import ReturnList
from polymorphic.models import PolymorphicModel
# AWX
from awx.main.access import get_user_capabilities
from awx.main.constants import (
SCHEDULEABLE_PROVIDERS,
ANSI_SGR_PATTERN,
@@ -49,7 +50,6 @@ from awx.main.constants import (
)
from awx.main.models import * # noqa
from awx.main.models.base import NEW_JOB_TYPE_CHOICES
from awx.main.access import get_user_capabilities
from awx.main.fields import ImplicitRoleField
from awx.main.utils import (
get_type_for_model, get_model_for_type, timestamp_apiformat,
@@ -203,11 +203,11 @@ class BaseSerializerMetaclass(serializers.SerializerMetaclass):
@staticmethod
def _is_list_of_strings(x):
return isinstance(x, (list, tuple)) and all([isinstance(y, basestring) for y in x])
return isinstance(x, (list, tuple)) and all([isinstance(y, str) for y in x])
@staticmethod
def _is_extra_kwargs(x):
return isinstance(x, dict) and all([isinstance(k, basestring) and isinstance(v, dict) for k,v in x.items()])
return isinstance(x, dict) and all([isinstance(k, str) and isinstance(v, dict) for k,v in x.items()])
@classmethod
def _update_meta(cls, base, meta, other=None):
@@ -259,9 +259,7 @@ class BaseSerializerMetaclass(serializers.SerializerMetaclass):
return super(BaseSerializerMetaclass, cls).__new__(cls, name, bases, attrs)
class BaseSerializer(serializers.ModelSerializer):
__metaclass__ = BaseSerializerMetaclass
class BaseSerializer(serializers.ModelSerializer, metaclass=BaseSerializerMetaclass):
class Meta:
fields = ('id', 'type', 'url', 'related', 'summary_fields', 'created',
@@ -284,7 +282,7 @@ class BaseSerializer(serializers.ModelSerializer):
# The following lines fix the problem of being able to pass JSON dict into PrimaryKeyRelatedField.
data = kwargs.get('data', False)
if data:
for field_name, field_instance in six.iteritems(self.fields):
for field_name, field_instance in self.fields.items():
if isinstance(field_instance, ManyRelatedField) and not field_instance.read_only:
if isinstance(data.get(field_name, False), dict):
raise serializers.ValidationError(_('Cannot use dictionary for %s' % field_name))
@@ -294,7 +292,7 @@ class BaseSerializer(serializers.ModelSerializer):
"""
The request version component of the URL as an integer i.e., 1 or 2
"""
return get_request_version(self.context.get('request'))
return get_request_version(self.context.get('request')) or 1
def get_type(self, obj):
return get_type_for_model(self.Meta.model)
@@ -612,7 +610,7 @@ class BaseSerializer(serializers.ModelSerializer):
v2.extend(e)
else:
v2.append(e)
d[k] = map(force_text, v2)
d[k] = list(map(force_text, v2))
raise ValidationError(d)
return attrs
@@ -632,9 +630,7 @@ class EmptySerializer(serializers.Serializer):
pass
class BaseFactSerializer(BaseSerializer):
__metaclass__ = BaseSerializerMetaclass
class BaseFactSerializer(BaseSerializer, metaclass=BaseSerializerMetaclass):
def get_fields(self):
ret = super(BaseFactSerializer, self).get_fields()
@@ -2139,10 +2135,10 @@ class InventorySourceSerializer(UnifiedJobTemplateSerializer, InventorySourceOpt
return attrs.get(fd, self.instance and getattr(self.instance, fd) or None)
if get_field_from_model_or_attrs('source') != 'scm':
redundant_scm_fields = filter(
redundant_scm_fields = list(filter(
lambda x: attrs.get(x, None),
['source_project', 'source_path', 'update_on_project_update']
)
))
if redundant_scm_fields:
raise serializers.ValidationError(
{"detail": _("Cannot set %s if not SCM type." % ' '.join(redundant_scm_fields))}
@@ -2465,17 +2461,17 @@ class CredentialTypeSerializer(BaseSerializer):
field['help_text'] = _(field['help_text'])
if field['type'] == 'become_method':
field.pop('type')
field['choices'] = map(operator.itemgetter(0), CHOICES_PRIVILEGE_ESCALATION_METHODS)
field['choices'] = list(map(operator.itemgetter(0), CHOICES_PRIVILEGE_ESCALATION_METHODS))
return value
def filter_field_metadata(self, fields, method):
# API-created/modified CredentialType kinds are limited to
# `cloud` and `net`
if method in ('PUT', 'POST'):
fields['kind']['choices'] = filter(
fields['kind']['choices'] = list(filter(
lambda choice: choice[0] in ('cloud', 'net'),
fields['kind']['choices']
)
))
return fields
@@ -2626,8 +2622,8 @@ class CredentialSerializer(BaseSerializer):
raise serializers.ValidationError({"kind": _('"%s" is not a valid choice' % kind)})
data['credential_type'] = credential_type.pk
value = OrderedDict(
{'credential_type': credential_type}.items() +
super(CredentialSerializer, self).to_internal_value(data).items()
list({'credential_type': credential_type}.items()) +
list(super(CredentialSerializer, self).to_internal_value(data).items())
)
# Make a set of the keys in the POST/PUT payload
@@ -2983,12 +2979,16 @@ class JobTemplateMixin(object):
'''
def _recent_jobs(self, obj):
if hasattr(obj, 'workflow_jobs'):
job_mgr = obj.workflow_jobs
else:
job_mgr = obj.jobs
return [{'id': x.id, 'status': x.status, 'finished': x.finished}
for x in job_mgr.all().order_by('-created')[:10]]
# Exclude "joblets", jobs that ran as part of a sliced workflow job
uj_qs = obj.unifiedjob_unified_jobs.exclude(job__job_slice_count__gt=1).order_by('-created')
# Would like to apply an .only, but does not play well with non_polymorphic
# .only('id', 'status', 'finished', 'polymorphic_ctype_id')
optimized_qs = uj_qs.non_polymorphic()
return [{
'id': x.id, 'status': x.status, 'finished': x.finished,
# Make type consistent with API top-level key, for instance workflow_job
'type': x.get_real_instance_class()._meta.verbose_name.replace(' ', '_')
} for x in optimized_qs[:10]]
def get_summary_fields(self, obj):
d = super(JobTemplateMixin, self).get_summary_fields(obj)
@@ -3483,12 +3483,16 @@ class AdHocCommandSerializer(UnifiedJobSerializer):
ret['name'] = obj.module_name
return ret
def validate(self, attrs):
ret = super(AdHocCommandSerializer, self).validate(attrs)
return ret
def validate_extra_vars(self, value):
redacted_extra_vars, removed_vars = extract_ansible_vars(value)
if removed_vars:
raise serializers.ValidationError(_(
"{} are prohibited from use in ad hoc commands."
).format(", ".join(removed_vars)))
).format(", ".join(sorted(removed_vars, reverse=True))))
return vars_validate_or_raise(value)
@@ -3716,7 +3720,7 @@ class LaunchConfigurationBaseSerializer(BaseSerializer):
for field in self.instance._meta.fields:
setattr(mock_obj, field.name, getattr(self.instance, field.name))
field_names = set(field.name for field in self.Meta.model._meta.fields)
for field_name, value in attrs.items():
for field_name, value in list(attrs.items()):
setattr(mock_obj, field_name, value)
if field_name not in field_names:
attrs.pop(field_name)
@@ -4330,7 +4334,7 @@ class JobLaunchSerializer(BaseSerializer):
passwords_needed=cred.passwords_needed
)
if cred.credential_type.managed_by_tower and 'vault_id' in cred.credential_type.defined_fields:
cred_dict['vault_id'] = cred.inputs.get('vault_id') or None
cred_dict['vault_id'] = cred.get_input('vault_id', default=None)
defaults_dict.setdefault(field_name, []).append(cred_dict)
else:
defaults_dict[field_name] = getattr(obj, field_name)
@@ -4486,11 +4490,11 @@ class NotificationTemplateSerializer(BaseSerializer):
model = NotificationTemplate
fields = ('*', 'organization', 'notification_type', 'notification_configuration')
type_map = {"string": (str, unicode),
type_map = {"string": (str,),
"int": (int,),
"bool": (bool,),
"list": (list,),
"password": (str, unicode),
"password": (str,),
"object": (dict, OrderedDict)}
def to_representation(self, obj):
@@ -4873,7 +4877,7 @@ class ActivityStreamSerializer(BaseSerializer):
for key in summary_dict.keys():
if 'id' not in summary_dict[key]:
summary_dict[key] = summary_dict[key] + ('id',)
field_list = summary_dict.items()
field_list = list(summary_dict.items())
# Needed related fields that are not in the default summary fields
field_list += [
('workflow_job_template_node', ('id', 'unified_job_template_id')),
@@ -4893,7 +4897,7 @@ class ActivityStreamSerializer(BaseSerializer):
def get_fields(self):
ret = super(ActivityStreamSerializer, self).get_fields()
for key, field in ret.items():
for key, field in list(ret.items()):
if key == 'changes':
field.help_text = _('A summary of the new and changed values when an object is created, updated, or deleted')
if key == 'object1':
@@ -4934,10 +4938,6 @@ class ActivityStreamSerializer(BaseSerializer):
def get_related(self, obj):
rel = {}
VIEW_NAME_EXCEPTIONS = {
'custom_inventory_script': 'inventory_script_detail',
'o_auth2_access_token': 'o_auth2_token_detail'
}
if obj.actor is not None:
rel['actor'] = self.reverse('api:user_detail', kwargs={'pk': obj.actor.pk})
for fk, __ in self._local_summarizable_fk_fields:
@@ -4951,11 +4951,12 @@ class ActivityStreamSerializer(BaseSerializer):
if getattr(thisItem, 'id', None) in id_list:
continue
id_list.append(getattr(thisItem, 'id', None))
if fk in VIEW_NAME_EXCEPTIONS:
view_name = VIEW_NAME_EXCEPTIONS[fk]
if hasattr(thisItem, 'get_absolute_url'):
rel_url = thisItem.get_absolute_url(self.context.get('request'))
else:
view_name = fk + '_detail'
rel[fk].append(self.reverse('api:' + view_name, kwargs={'pk': thisItem.id}))
rel_url = self.reverse('api:' + view_name, kwargs={'pk': thisItem.id})
rel[fk].append(rel_url)
if fk == 'schedule':
rel['unified_job_template'] = thisItem.unified_job_template.get_absolute_url(self.context.get('request'))
@@ -5038,7 +5039,7 @@ class FactVersionSerializer(BaseFactSerializer):
}
res['fact_view'] = '%s?%s' % (
reverse('api:host_fact_compare_view', kwargs={'pk': obj.host.pk}, request=self.context.get('request')),
urllib.urlencode(params)
urllib.parse.urlencode(params)
)
return res

View File

@@ -18,7 +18,7 @@ import six
# Django
from django.conf import settings
from django.core.exceptions import FieldError, ObjectDoesNotExist
from django.db.models import Q
from django.db.models import Q, Sum
from django.db import IntegrityError, transaction, connection
from django.shortcuts import get_object_or_404
from django.utils.safestring import mark_safe
@@ -70,7 +70,6 @@ from awx.main.models import * # noqa
from awx.main.utils import * # noqa
from awx.main.utils import (
extract_ansible_vars,
decrypt_field,
)
from awx.main.utils.encryption import encrypt_value
from awx.main.utils.filters import SmartFilter
@@ -173,7 +172,7 @@ class DashboardView(APIView):
user_inventory = get_user_queryset(request.user, Inventory)
inventory_with_failed_hosts = user_inventory.filter(hosts_with_active_failures__gt=0)
user_inventory_external = user_inventory.filter(has_inventory_sources=True)
failed_inventory = sum(i.inventory_sources_with_failures for i in user_inventory)
failed_inventory = user_inventory.aggregate(Sum('inventory_sources_with_failures'))['inventory_sources_with_failures__sum']
data['inventories'] = {'url': reverse('api:inventory_list', request=request),
'total': user_inventory.count(),
'total_with_inventory_source': user_inventory_external.count(),
@@ -517,7 +516,7 @@ class AuthView(APIView):
from rest_framework.reverse import reverse
data = OrderedDict()
err_backend, err_message = request.session.get('social_auth_error', (None, None))
auth_backends = load_backends(settings.AUTHENTICATION_BACKENDS, force_load=True).items()
auth_backends = list(load_backends(settings.AUTHENTICATION_BACKENDS, force_load=True).items())
# Return auth backends in consistent order: Google, GitHub, SAML.
auth_backends.sort(key=lambda x: 'g' if x[0] == 'google-oauth2' else x[0])
for name, backend in auth_backends:
@@ -1592,7 +1591,7 @@ class HostInsights(GenericAPIView):
serializer_class = EmptySerializer
def _extract_insights_creds(self, credential):
return (credential.inputs['username'], decrypt_field(credential, 'password'))
return (credential.get_input('username', default=''), credential.get_input('password', default=''))
def _get_insights(self, url, username, password):
session = requests.Session()
@@ -2308,7 +2307,7 @@ class JobTemplateLaunch(RetrieveAPIView):
raise ParseError({key: [msg], 'credentials': [msg]})
# add the deprecated credential specified in the request
if not isinstance(prompted_value, Iterable) or isinstance(prompted_value, basestring):
if not isinstance(prompted_value, Iterable) or isinstance(prompted_value, str):
prompted_value = [prompted_value]
# If user gave extra_credentials, special case to use exactly
@@ -3154,9 +3153,10 @@ class WorkflowJobRelaunch(WorkflowsEnforcementMixin, GenericAPIView):
def post(self, request, *args, **kwargs):
obj = self.get_object()
if obj.is_sliced_job:
if not obj.job_template_id:
jt = obj.job_template
if not jt:
raise ParseError(_('Cannot relaunch slice workflow job orphaned from job template.'))
elif obj.job_template.job_slice_count != obj.workflow_nodes.count():
elif not jt.inventory or min(jt.inventory.hosts.count(), jt.job_slice_count) != obj.workflow_nodes.count():
raise ParseError(_('Cannot relaunch sliced workflow job after slice count has changed.'))
new_workflow_job = obj.create_relaunch_workflow_job()
new_workflow_job.signal_start()
@@ -4458,7 +4458,7 @@ class RoleChildrenList(SubListAPIView):
# in URL patterns and reverse URL lookups, converting CamelCase names to
# lowercase_with_underscore (e.g. MyView.as_view() becomes my_view).
this_module = sys.modules[__name__]
for attr, value in locals().items():
for attr, value in list(locals().items()):
if isinstance(value, type) and issubclass(value, APIView):
name = camelcase_to_underscore(attr)
view = value.as_view()
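A minimal sketch of the aggregation pattern the dashboard hunk above adopts, replacing a per-row Python sum with a single SQL aggregate; the function name is made up and user_inventory is any Inventory queryset:

from django.db.models import Sum

def failed_inventory_count(user_inventory):
    # one aggregate query instead of iterating every Inventory row in Python
    total = user_inventory.aggregate(
        Sum('inventory_sources_with_failures')
    )['inventory_sources_with_failures__sum']
    return total or 0   # aggregate() returns None for an empty queryset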

View File

@@ -2,6 +2,7 @@
# All Rights Reserved.
import logging
import operator
import json
from collections import OrderedDict
@@ -161,7 +162,7 @@ class ApiV1PingView(APIView):
for instance in Instance.objects.all():
response['instances'].append(dict(node=instance.hostname, heartbeat=instance.modified,
capacity=instance.capacity, version=instance.version))
response['instances'].sort()
sorted(response['instances'], key=operator.itemgetter('node'))
response['instance_groups'] = []
for instance_group in InstanceGroup.objects.all():
response['instance_groups'].append(dict(name=instance_group.name,

View File

@@ -1,6 +1,6 @@
# Python
import logging
import urlparse
import urllib.parse as urlparse
from collections import OrderedDict
# Django
@@ -71,7 +71,7 @@ class StringListBooleanField(ListField):
return False
elif value in NullBooleanField.NULL_VALUES:
return None
elif isinstance(value, basestring):
elif isinstance(value, str):
return self.child.to_representation(value)
except TypeError:
pass
@@ -88,7 +88,7 @@ class StringListBooleanField(ListField):
return False
elif data in NullBooleanField.NULL_VALUES:
return None
elif isinstance(data, basestring):
elif isinstance(data, str):
return self.child.run_validation(data)
except TypeError:
pass

View File

@@ -460,10 +460,10 @@ class Command(BaseCommand):
elif file_to_comment not in to_comment_patterns:
to_comment_patterns.append(file_to_comment)
# Run once in dry-run mode to catch any errors from updating the files.
diffs = comment_assignments(to_comment_patterns, to_comment.keys(), dry_run=True, backup_suffix=self.backup_suffix)
diffs = comment_assignments(to_comment_patterns, list(to_comment.keys()), dry_run=True, backup_suffix=self.backup_suffix)
# Then, if really updating, run again.
if not self.dry_run and not self.no_comment:
diffs = comment_assignments(to_comment_patterns, to_comment.keys(), dry_run=False, backup_suffix=self.backup_suffix)
diffs = comment_assignments(to_comment_patterns, list(to_comment.keys()), dry_run=False, backup_suffix=self.backup_suffix)
if license_file_to_comment:
diffs.extend(self._comment_license_file(dry_run=False))
if local_settings_file_to_comment:

View File

@@ -33,7 +33,7 @@ class Setting(CreatedModifiedModel):
on_delete=models.CASCADE,
))
def __unicode__(self):
def __str__(self):
try:
json_value = json.dumps(self.value)
except ValueError:

View File

@@ -6,18 +6,17 @@ import re
import sys
import threading
import time
import StringIO
import traceback
import urllib
import six
import urllib.parse
from io import StringIO
# Django
from django.conf import LazySettings
from django.conf import settings, UserSettingsHolder
from django.core.cache import cache as django_cache
from django.core.exceptions import ImproperlyConfigured
from django.db import ProgrammingError, OperationalError, transaction, connection
from django.db import transaction, connection
from django.db.utils import Error as DBError
from django.utils.functional import cached_property
# Django REST Framework
@@ -67,7 +66,7 @@ def normalize_broker_url(value):
match = re.search('(amqp://[^:]+:)(.*)', parts[0])
if match:
prefix, password = match.group(1), match.group(2)
parts[0] = prefix + urllib.quote(password)
parts[0] = prefix + urllib.parse.quote(password)
return '@'.join(parts)
@@ -90,21 +89,21 @@ def _ctit_db_wrapper(trans_safe=False):
logger.debug('Obtaining database settings in spite of broken transaction.')
transaction.set_rollback(False)
yield
except (ProgrammingError, OperationalError):
except DBError:
if 'migrate' in sys.argv and get_tower_migration_version() < '310':
logger.info('Using default settings until version 3.1 migration.')
else:
# We want the _full_ traceback with the context
# First we get the current call stack, which constitutes the "top",
# it has the context up to the point where the context manager is used
top_stack = StringIO.StringIO()
top_stack = StringIO()
traceback.print_stack(file=top_stack)
top_lines = top_stack.getvalue().strip('\n').split('\n')
top_stack.close()
# Get "bottom" stack from the local error that happened
# inside of the "with" block this wraps
exc_type, exc_value, exc_traceback = sys.exc_info()
bottom_stack = StringIO.StringIO()
bottom_stack = StringIO()
traceback.print_tb(exc_traceback, file=bottom_stack)
bottom_lines = bottom_stack.getvalue().strip('\n').split('\n')
# Glue together top and bottom where overlap is found
@@ -168,15 +167,6 @@ class EncryptedCacheProxy(object):
def get(self, key, **kwargs):
value = self.cache.get(key, **kwargs)
value = self._handle_encryption(self.decrypter, key, value)
# python-memcached auto-encodes unicode on cache set in python2
# https://github.com/linsomniac/python-memcached/issues/79
# https://github.com/linsomniac/python-memcached/blob/288c159720eebcdf667727a859ef341f1e908308/memcache.py#L961
if six.PY2 and isinstance(value, six.binary_type):
try:
six.text_type(value)
except UnicodeDecodeError:
value = value.decode('utf-8')
logger.debug('cache get(%r, %r) -> %r', key, empty, filter_sensitive(self.registry, key, value))
return value
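A self-contained sketch of the py3 StringIO usage in the traceback-gluing code above; the helper name is illustrative:

import traceback
from io import StringIO   # py3: StringIO lives in io, there is no StringIO module

def current_stack_lines():
    buff = StringIO()
    traceback.print_stack(file=buff)     # capture the current call stack as text
    lines = buff.getvalue().strip('\n').split('\n')
    buff.close()
    return lines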

View File

@@ -9,15 +9,11 @@ from django.core.cache import cache
from django.dispatch import receiver
# Tower
import awx.main.signals
from awx.conf import settings_registry
from awx.conf.models import Setting
from awx.conf.serializers import SettingSerializer
logger = logging.getLogger('awx.conf.signals')
awx.main.signals.model_serializer_mapping[Setting] = SettingSerializer
__all__ = []

View File

@@ -1,5 +1,5 @@
import pytest
import mock
from unittest import mock
from rest_framework import serializers

View File

@@ -1,38 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
import pytest
import mock
from django.apps import apps
from awx.conf.migrations._reencrypt import (
replace_aesecb_fernet,
encrypt_field,
decrypt_field,
)
from awx.conf.settings import Setting
from awx.main.utils import decrypt_field as new_decrypt_field
@pytest.mark.django_db
@pytest.mark.parametrize("old_enc, new_enc, value", [
('$encrypted$UTF8$AES', '$encrypted$UTF8$AESCBC$', u'Iñtërnâtiônàlizætiøn'),
('$encrypted$AES$', '$encrypted$AESCBC$', 'test'),
])
def test_settings(old_enc, new_enc, value):
with mock.patch('awx.conf.models.encrypt_field', encrypt_field):
with mock.patch('awx.conf.settings.decrypt_field', decrypt_field):
setting = Setting.objects.create(key='SOCIAL_AUTH_GITHUB_SECRET', value=value)
assert setting.value.startswith(old_enc)
replace_aesecb_fernet(apps, None)
setting.refresh_from_db()
assert setting.value.startswith(new_enc)
assert new_decrypt_field(setting, 'value') == value
# This is here for a side-effect.
# Exception if the encryption type of AESCBC is not properly skipped, ensures
# our `startswith` calls don't have typos
replace_aesecb_fernet(apps, None)

View File

@@ -4,6 +4,7 @@
# All Rights Reserved.
from contextlib import contextmanager
import codecs
from uuid import uuid4
import time
@@ -67,7 +68,7 @@ def test_cached_settings_unicode_is_auto_decoded(settings):
# https://github.com/linsomniac/python-memcached/issues/79
# https://github.com/linsomniac/python-memcached/blob/288c159720eebcdf667727a859ef341f1e908308/memcache.py#L961
value = six.u('Iñtërnâtiônàlizætiøn').encode('utf-8') # this simulates what python-memcached does on cache.set()
value = 'Iñtërnâtiônàlizætiøn' # this simulates what python-memcached does on cache.set()
settings.cache.set('DEBUG', value)
assert settings.cache.get('DEBUG') == six.u('Iñtërnâtiônàlizætiøn')
@@ -262,7 +263,7 @@ def test_setting_from_db_with_unicode(settings, mocker, encrypted):
encrypted=encrypted
)
# this simulates a bug in python-memcached; see https://github.com/linsomniac/python-memcached/issues/79
value = six.u('Iñtërnâtiônàlizætiøn').encode('utf-8')
value = 'Iñtërnâtiônàlizætiøn'
setting_from_db = mocker.Mock(id=1, key='AWX_SOME_SETTING', value=value)
mocks = mocker.Mock(**{
@@ -272,8 +273,8 @@ def test_setting_from_db_with_unicode(settings, mocker, encrypted):
}),
})
with mocker.patch('awx.conf.models.Setting.objects.filter', return_value=mocks):
assert settings.AWX_SOME_SETTING == six.u('Iñtërnâtiônàlizætiøn')
assert settings.cache.get('AWX_SOME_SETTING') == six.u('Iñtërnâtiônàlizætiøn')
assert settings.AWX_SOME_SETTING == 'Iñtërnâtiônàlizætiøn'
assert settings.cache.get('AWX_SOME_SETTING') == 'Iñtërnâtiônàlizætiøn'
@pytest.mark.defined_in_file(AWX_SOME_SETTING='DEFAULT')
@@ -434,7 +435,7 @@ def test_sensitive_cache_data_is_encrypted(settings, mocker):
def rot13(obj, attribute):
assert obj.pk == 123
return getattr(obj, attribute).encode('rot13')
return codecs.encode(getattr(obj, attribute), 'rot_13')
native_cache = LocMemCache(str(uuid4()), {})
cache = EncryptedCacheProxy(
@@ -471,7 +472,7 @@ def test_readonly_sensitive_cache_data_is_encrypted(settings):
def rot13(obj, attribute):
assert obj.pk is None
return getattr(obj, attribute).encode('rot13')
return codecs.encode(getattr(obj, attribute), 'rot_13')
native_cache = LocMemCache(str(uuid4()), {})
cache = EncryptedCacheProxy(

View File

@@ -102,7 +102,7 @@ def comment_assignments_in_file(filename, assignment_names, dry_run=True, backup
if not dry_run:
if backup_filename:
shutil.copy2(filename, backup_filename)
with open(filename, 'wb') as fileobj:
with open(filename, 'w') as fileobj:
fileobj.write(new_file_data)
return '\n'.join(diff_lines)
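The 'wb' -> 'w' switch above matters because py3 separates text from bytes: writing a str to a file opened in binary mode raises TypeError. A tiny sketch with a hypothetical path and contents:

new_file_data = '# ANSIBLE_COW_SELECTION = "random"\n'       # illustrative contents
with open('/tmp/local_settings.py', 'w') as fileobj:         # hypothetical path, text mode
    fileobj.write(new_file_data)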

View File

@@ -72,7 +72,7 @@ class SettingSingletonDetail(RetrieveUpdateDestroyAPIView):
def get_queryset(self):
self.category_slug = self.kwargs.get('category_slug', 'all')
all_category_slugs = settings_registry.get_registered_categories(features_enabled=get_licensed_features()).keys()
all_category_slugs = list(settings_registry.get_registered_categories(features_enabled=get_licensed_features()).keys())
for slug_to_delete in VERSION_SPECIFIC_CATEGORIES_TO_EXCLUDE[get_request_version(self.request)]:
all_category_slugs.remove(slug_to_delete)
if self.request.user.is_superuser or getattr(self.request.user, 'is_system_auditor', False):
@@ -123,7 +123,7 @@ class SettingSingletonDetail(RetrieveUpdateDestroyAPIView):
if key == 'LICENSE' or settings_registry.is_setting_read_only(key):
continue
if settings_registry.is_setting_encrypted(key) and \
isinstance(value, basestring) and \
isinstance(value, str) and \
value.startswith('$encrypted$'):
continue
setattr(serializer.instance, key, value)
@@ -210,7 +210,7 @@ class SettingLoggingTest(GenericAPIView):
# in URL patterns and reverse URL lookups, converting CamelCase names to
# lowercase_with_underscore (e.g. MyView.as_view() becomes my_view).
this_module = sys.modules[__name__]
for attr, value in locals().items():
for attr, value in list(locals().items()):
if isinstance(value, type) and issubclass(value, APIView):
name = camelcase_to_underscore(attr)
view = value.as_view()
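The list(...keys()) wrappers above are needed because py3 dict views are lazy and lack list methods such as remove(); a standalone sketch with made-up category slugs:

categories = {'system': 'System', 'jobs': 'Jobs', 'ui': 'UI'}    # illustrative registry
all_category_slugs = list(categories.keys())
all_category_slugs.remove('ui')     # works on the list copy; a dict view has no .remove()
assert sorted(all_category_slugs) == ['jobs', 'system']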

View File

@@ -35,8 +35,6 @@ except ImportError:
os.environ['VIRTUAL_ENV']
))
from six.moves import xrange
__all__ = ['event_context']
@@ -56,9 +54,8 @@ class IsolatedFileWrite:
filename = '{}-partial.json'.format(event_uuid)
dropoff_location = os.path.join(self.private_data_dir, 'artifacts', 'job_events', filename)
write_location = '.'.join([dropoff_location, 'tmp'])
partial_data = json.dumps(value)
with os.fdopen(os.open(write_location, os.O_WRONLY | os.O_CREAT, stat.S_IRUSR | stat.S_IWUSR), 'w') as f:
f.write(partial_data)
f.write(value)
os.rename(write_location, dropoff_location)
@@ -154,7 +151,7 @@ class EventContext(object):
if event not in ('playbook_on_stats',) and "res" in event_data and len(str(event_data['res'])) > max_res:
event_data['res'] = {}
event_dict = dict(event=event, event_data=event_data)
for key in event_data.keys():
for key in list(event_data.keys()):
if key in ('job_id', 'ad_hoc_command_id', 'project_update_id', 'uuid', 'parent_uuid', 'created',):
event_dict[key] = event_data.pop(key)
elif key in ('verbosity', 'pid'):
@@ -165,11 +162,11 @@ class EventContext(object):
return {}
def dump(self, fileobj, data, max_width=78, flush=False):
b64data = base64.b64encode(json.dumps(data))
b64data = base64.b64encode(json.dumps(data).encode('utf-8')).decode()
with self.display_lock:
# pattern corresponding to OutputEventFilter expectation
fileobj.write(u'\x1b[K')
for offset in xrange(0, len(b64data), max_width):
for offset in range(0, len(b64data), max_width):
chunk = b64data[offset:offset + max_width]
escaped_chunk = u'{}\x1b[{}D'.format(chunk, len(chunk))
fileobj.write(escaped_chunk)
@@ -179,7 +176,7 @@ class EventContext(object):
def dump_begin(self, fileobj):
begin_dict = self.get_begin_dict()
self.cache.set(":1:ev-{}".format(begin_dict['uuid']), begin_dict)
self.cache.set(":1:ev-{}".format(begin_dict['uuid']), json.dumps(begin_dict))
self.dump(fileobj, {'uuid': begin_dict['uuid']})
def dump_end(self, fileobj):
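b64encode() takes and returns bytes under py3, hence the encode()/decode() pair added to dump() above. A round-trip sketch with a made-up payload:

import base64
import json

payload = {'uuid': 'abc-123', 'event': 'playbook_on_start'}     # illustrative event data
b64data = base64.b64encode(json.dumps(payload).encode('utf-8')).decode()
assert json.loads(base64.b64decode(b64data)) == payload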

View File

@@ -5,11 +5,11 @@ from __future__ import absolute_import
from collections import OrderedDict
import json
import mock
import os
import shutil
import sys
import tempfile
from unittest import mock
import pytest

View File

@@ -2835,7 +2835,7 @@ msgstr ""
#: awx/main/models/credential/__init__.py:979
msgid ""
"Enter the URL for the virtual machine that corresponds to your CloudForm "
"Enter the URL for the virtual machine that corresponds to your CloudForms "
"instance. For example, https://cloudforms.example.org"
msgstr ""

View File

@@ -2835,7 +2835,7 @@ msgstr ""
#: awx/main/models/credential/__init__.py:979
msgid ""
"Enter the URL for the virtual machine that corresponds to your CloudForm "
"Enter the URL for the virtual machine that corresponds to your CloudForms "
"instance. For example, https://cloudforms.example.org"
msgstr ""

View File

@@ -3086,7 +3086,7 @@ msgstr "URL CloudForms"
#: awx/main/models/credential/__init__.py:982
msgid ""
"Enter the URL for the virtual machine that corresponds to your CloudForm "
"Enter the URL for the virtual machine that corresponds to your CloudForms "
"instance. For example, https://cloudforms.example.org"
msgstr ""
"Introduzca la URL para la máquina virtual que corresponda a su instancia "

View File

@@ -3099,7 +3099,7 @@ msgstr "URL CloudForms"
#: awx/main/models/credential/__init__.py:982
msgid ""
"Enter the URL for the virtual machine that corresponds to your CloudForm "
"Enter the URL for the virtual machine that corresponds to your CloudForms "
"instance. For example, https://cloudforms.example.org"
msgstr ""
"Veuillez saisir lURL de la machine virtuelle qui correspond à votre "

View File

@@ -2858,7 +2858,7 @@ msgstr "CloudForms URL"
#: awx/main/models/credential/__init__.py:982
msgid ""
"Enter the URL for the virtual machine that corresponds to your CloudForm "
"Enter the URL for the virtual machine that corresponds to your CloudForms "
"instance. For example, https://cloudforms.example.org"
msgstr ""
"CloudForms インスタンスに対応する仮想マシンの URL を入力します (例: https://cloudforms.example.org)。"

View File

@@ -3072,7 +3072,7 @@ msgstr "CloudForms-URL"
#: awx/main/models/credential/__init__.py:982
msgid ""
"Enter the URL for the virtual machine that corresponds to your CloudForm "
"Enter the URL for the virtual machine that corresponds to your CloudForms "
"instance. For example, https://cloudforms.example.org"
msgstr ""
"Voer de URL in voor de virtuele machine die overeenkomt met uw CloudForm-"

View File

@@ -1276,6 +1276,7 @@ class JobTemplateAccess(BaseAccess):
'instance_groups',
'credentials__credential_type',
Prefetch('labels', queryset=Label.objects.all().order_by('name')),
Prefetch('last_job', queryset=UnifiedJob.objects.non_polymorphic()),
)
def filtered_queryset(self):
@@ -1389,13 +1390,15 @@ class JobTemplateAccess(BaseAccess):
'job_tags', 'force_handlers', 'skip_tags', 'ask_variables_on_launch',
'ask_tags_on_launch', 'ask_job_type_on_launch', 'ask_skip_tags_on_launch',
'ask_inventory_on_launch', 'ask_credential_on_launch', 'survey_enabled',
'custom_virtualenv', 'diff_mode',
'custom_virtualenv', 'diff_mode', 'timeout', 'job_slice_count',
# These fields are ignored, but it is convenient for QA to allow clients to post them
'last_job_run', 'created', 'modified',
]
for k, v in data.items():
if k not in [x.name for x in obj._meta.concrete_fields]:
continue
if hasattr(obj, k) and getattr(obj, k) != v:
if k not in field_whitelist and v != getattr(obj, '%s_id' % k, None) \
and not (hasattr(obj, '%s_id' % k) and getattr(obj, '%s_id' % k) is None and v == ''): # Equate '' to None in the case of foreign keys

View File

@@ -197,6 +197,18 @@ register(
category_slug='jobs',
)
register(
'AWX_ISOLATED_VERBOSITY',
field_class=fields.IntegerField,
min_value=0,
max_value=5,
label=_('Verbosity level for isolated node management tasks'),
help_text=_('This can be raised to aid in debugging connection issues for isolated task execution'),
category=_('Jobs'),
category_slug='jobs',
default=0
)
register(
'AWX_ISOLATED_CHECK_INTERVAL',
field_class=fields.IntegerField,

View File

@@ -4,6 +4,7 @@ import logging
from channels import Group
from channels.auth import channel_session_user_from_http, channel_session_user
from django.utils.encoding import smart_str
from django.http.cookie import parse_cookie
from django.core.serializers.json import DjangoJSONEncoder
@@ -30,7 +31,7 @@ def ws_connect(message):
# store the valid CSRF token from the cookie so we can compare it later
# on ws_receive
cookie_token = parse_cookie(
headers.get('cookie')
smart_str(headers.get(b'cookie'))
).get('csrftoken')
if cookie_token:
message.channel_session[XRF_KEY] = cookie_token
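Under py3 the raw Channels header value arrives as bytes, so the hunk above passes it through smart_str() before parse_cookie(). A standalone sketch with a dummy header:

from django.http.cookie import parse_cookie
from django.utils.encoding import smart_str

raw_cookie = b'csrftoken=abc123; sessionid=xyz'                 # illustrative header value
cookie_token = parse_cookie(smart_str(raw_cookie)).get('csrftoken')
assert cookie_token == 'abc123'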

View File

@@ -2,4 +2,4 @@ from django.conf import settings
def get_local_queuename():
return settings.CLUSTER_HOST_ID.encode('utf-8')
return settings.CLUSTER_HOST_ID

View File

@@ -8,7 +8,7 @@ from uuid import uuid4
import collections
from multiprocessing import Process
from multiprocessing import Queue as MPQueue
from Queue import Full as QueueFull, Empty as QueueEmpty
from queue import Full as QueueFull, Empty as QueueEmpty
from django.conf import settings
from django.db import connection as django_connection, connections
@@ -129,7 +129,7 @@ class PoolWorker(object):
# the task at [0] is the one that's running right now (or is about to
# be running)
if len(self.managed_tasks):
return self.managed_tasks[self.managed_tasks.keys()[0]]
return self.managed_tasks[list(self.managed_tasks.keys())[0]]
return None
@@ -180,7 +180,7 @@ class WorkerPool(object):
class MessagePrinter(awx.main.dispatch.worker.BaseWorker):
def perform_work(self, body):
print body
print(body)
pool = WorkerPool(min_workers=4) # spawn four worker processes
pool.init_workers(MessagePrinter().work_loop)
@@ -253,7 +253,7 @@ class WorkerPool(object):
return tmpl.render(pool=self, workers=self.workers, meta=self.debug_meta)
def write(self, preferred_queue, body):
queue_order = sorted(range(len(self.workers)), cmp=lambda x, y: -1 if x==preferred_queue else 0)
queue_order = sorted(range(len(self.workers)), key=lambda x: -1 if x==preferred_queue else x)
write_attempt_order = []
for queue_actual in queue_order:
try:
@@ -365,7 +365,7 @@ class AutoscalePool(WorkerPool):
running_uuids = []
for worker in self.workers:
worker.calculate_managed_tasks()
running_uuids.extend(worker.managed_tasks.keys())
running_uuids.extend(list(worker.managed_tasks.keys()))
try:
reaper.reap(excluded_uuids=running_uuids)
except Exception:
@@ -373,6 +373,10 @@ class AutoscalePool(WorkerPool):
# don't use our logger (it accesses the database for configuration)
_, _, tb = sys.exc_info()
traceback.print_tb(tb)
for conn in connections.all():
# If the database connection has a hiccup, re-establish a new
# connection
conn.close_if_unusable_or_obsolete()
def up(self):
if self.full:
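py3 dict views cannot be indexed, which is why PoolWorker.current_task above wraps keys() in list(). A minimal sketch; the task bodies are made up:

from collections import OrderedDict

managed_tasks = OrderedDict([('uuid-1', {'task': 'first'}),
                             ('uuid-2', {'task': 'second'})])
# the task at index 0 is the one running right now (insertion order)
current = managed_tasks[list(managed_tasks.keys())[0]]
assert current == {'task': 'first'}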

View File

@@ -45,7 +45,7 @@ class task:
@task(queue='tower_broadcast', exchange_type='fanout')
def announce():
print "Run this everywhere!"
print("Run this everywhere!")
"""
def __init__(self, queue=None, exchange_type=None):

View File

@@ -5,7 +5,7 @@ import os
import logging
import signal
from uuid import UUID
from Queue import Empty as QueueEmpty
from queue import Empty as QueueEmpty
from django import db
from kombu import Producer

View File

@@ -30,11 +30,18 @@ class TaskWorker(BaseWorker):
awx.main.tasks.delete_inventory
awx.main.tasks.RunProjectUpdate
'''
if not task.startswith('awx.'):
raise ValueError('{} is not a valid awx task'.format(task))
module, target = task.rsplit('.', 1)
module = importlib.import_module(module)
_call = None
if hasattr(module, target):
_call = getattr(module, target, None)
if not (
hasattr(_call, 'apply_async') and hasattr(_call, 'delay')
):
raise ValueError('{} is not decorated with @task()'.format(task))
return _call
def run_callable(self, body):
@@ -78,6 +85,7 @@ class TaskWorker(BaseWorker):
try:
result = self.run_callable(body)
except Exception as exc:
result = exc
try:
if getattr(exc, 'is_awx_task_error', False):

View File

@@ -1,6 +1,5 @@
import base64
import codecs
import StringIO
import json
import os
import shutil
@@ -9,8 +8,10 @@ import tempfile
import time
import logging
from distutils.version import LooseVersion as Version
from io import StringIO
from django.conf import settings
from django.utils.encoding import smart_bytes, smart_str
import awx
from awx.main.expect import run
@@ -101,6 +102,8 @@ class IsolatedManager(object):
]
if extra_vars:
args.extend(['-e', json.dumps(extra_vars)])
if settings.AWX_ISOLATED_VERBOSITY:
args.append('-%s' % ('v' * min(5, settings.AWX_ISOLATED_VERBOSITY)))
return args
@staticmethod
@@ -142,7 +145,7 @@ class IsolatedManager(object):
# if an ssh private key fifo exists, read its contents and delete it
if self.ssh_key_path:
buff = StringIO.StringIO()
buff = StringIO()
with open(self.ssh_key_path, 'r') as fifo:
for line in fifo:
buff.write(line)
@@ -154,7 +157,10 @@ class IsolatedManager(object):
# into a variable, and will replicate the data into a named pipe on the
# isolated instance
secrets_path = os.path.join(self.private_data_dir, 'env')
run.open_fifo_write(secrets_path, base64.b64encode(json.dumps(secrets)))
run.open_fifo_write(
secrets_path,
smart_str(base64.b64encode(smart_bytes(json.dumps(secrets))))
)
self.build_isolated_job_data()
@@ -174,7 +180,7 @@ class IsolatedManager(object):
args = self._build_args('run_isolated.yml', '%s,' % self.host, extra_vars)
if self.instance.verbosity:
args.append('-%s' % ('v' * min(5, self.instance.verbosity)))
buff = StringIO.StringIO()
buff = StringIO()
logger.debug('Starting job {} on isolated host with `run_isolated.yml` playbook.'.format(self.instance.id))
status, rc = IsolatedManager.run_pexpect(
args, self.awx_playbook_path(), self.management_env, buff,
@@ -244,7 +250,7 @@ class IsolatedManager(object):
os.makedirs(self.path_to('artifacts', 'job_events'), mode=stat.S_IXUSR + stat.S_IWUSR + stat.S_IRUSR)
def _missing_artifacts(self, path_list, output):
missing_artifacts = filter(lambda path: not os.path.exists(path), path_list)
missing_artifacts = list(filter(lambda path: not os.path.exists(path), path_list))
for path in missing_artifacts:
self.stdout_handle.write('ansible did not exit cleanly, missing `{}`.\n'.format(path))
if missing_artifacts:
@@ -282,7 +288,7 @@ class IsolatedManager(object):
status = 'failed'
output = ''
rc = None
buff = StringIO.StringIO()
buff = StringIO()
last_check = time.time()
seek = 0
job_timeout = remaining = self.job_timeout
@@ -303,7 +309,7 @@ class IsolatedManager(object):
time.sleep(1)
continue
buff = StringIO.StringIO()
buff = StringIO()
logger.debug('Checking on isolated job {} with `check_isolated.yml`.'.format(self.instance.id))
status, rc = IsolatedManager.run_pexpect(
args, self.awx_playbook_path(), self.management_env, buff,
@@ -318,7 +324,7 @@ class IsolatedManager(object):
path = self.path_to('artifacts', 'stdout')
if os.path.exists(path):
with open(path, 'r') as f:
with codecs.open(path, 'r', encoding='utf-8') as f:
f.seek(seek)
for line in f:
self.stdout_handle.write(line)
@@ -340,7 +346,7 @@ class IsolatedManager(object):
elif status == 'failed':
# if we were unable to retrieve job results from the isolated host,
# print stdout of the `check_isolated.yml` playbook for clues
self.stdout_handle.write(output)
self.stdout_handle.write(smart_str(output))
return status, rc
@@ -355,7 +361,7 @@ class IsolatedManager(object):
}
args = self._build_args('clean_isolated.yml', '%s,' % self.host, extra_vars)
logger.debug('Cleaning up job {} on isolated host with `clean_isolated.yml` playbook.'.format(self.instance.id))
buff = StringIO.StringIO()
buff = StringIO()
timeout = max(60, 2 * settings.AWX_ISOLATED_CONNECTION_TIMEOUT)
status, rc = IsolatedManager.run_pexpect(
args, self.awx_playbook_path(), self.management_env, buff,
@@ -407,46 +413,52 @@ class IsolatedManager(object):
args = cls._build_args('heartbeat_isolated.yml', hostname_string)
args.extend(['--forks', str(len(instance_qs))])
env = cls._base_management_env()
env['ANSIBLE_STDOUT_CALLBACK'] = 'json'
buff = StringIO.StringIO()
timeout = max(60, 2 * settings.AWX_ISOLATED_CONNECTION_TIMEOUT)
status, rc = IsolatedManager.run_pexpect(
args, cls.awx_playbook_path(), env, buff,
idle_timeout=timeout, job_timeout=timeout,
pexpect_timeout=5
)
output = buff.getvalue().encode('utf-8')
buff.close()
try:
result = json.loads(output)
if not isinstance(result, dict):
raise TypeError('Expected a dict but received {}.'.format(str(type(result))))
except (ValueError, AssertionError, TypeError):
logger.exception('Failed to read status from isolated instances, output:\n {}'.format(output))
return
facts_path = tempfile.mkdtemp()
env['ANSIBLE_CACHE_PLUGIN'] = 'jsonfile'
env['ANSIBLE_CACHE_PLUGIN_CONNECTION'] = facts_path
for instance in instance_qs:
try:
task_result = result['plays'][0]['tasks'][0]['hosts'][instance.hostname]
except (KeyError, IndexError):
buff = StringIO()
timeout = max(60, 2 * settings.AWX_ISOLATED_CONNECTION_TIMEOUT)
status, rc = IsolatedManager.run_pexpect(
args, cls.awx_playbook_path(), env, buff,
idle_timeout=timeout, job_timeout=timeout,
pexpect_timeout=5
)
heartbeat_stdout = buff.getvalue().encode('utf-8')
buff.close()
for instance in instance_qs:
output = heartbeat_stdout
task_result = {}
if 'capacity_cpu' in task_result and 'capacity_mem' in task_result:
cls.update_capacity(instance, task_result, awx_application_version)
logger.debug('Isolated instance {} successful heartbeat'.format(instance.hostname))
elif instance.capacity == 0:
logger.debug('Isolated instance {} previously marked as lost, could not re-join.'.format(
instance.hostname))
else:
logger.warning('Could not update status of isolated instance {}, msg={}'.format(
instance.hostname, task_result.get('msg', 'unknown failure')
))
if instance.is_lost(isolated=True):
instance.capacity = 0
instance.save(update_fields=['capacity'])
logger.error('Isolated instance {} last checked in at {}, marked as lost.'.format(
instance.hostname, instance.modified))
try:
with open(os.path.join(facts_path, instance.hostname), 'r') as facts_data:
output = facts_data.read()
task_result = json.loads(output)
except Exception:
logger.exception('Failed to read status from isolated instances, output:\n {}'.format(output))
if 'awx_capacity_cpu' in task_result and 'awx_capacity_mem' in task_result:
task_result = {
'capacity_cpu': task_result['awx_capacity_cpu'],
'capacity_mem': task_result['awx_capacity_mem'],
'version': task_result['awx_capacity_version']
}
cls.update_capacity(instance, task_result, awx_application_version)
logger.debug('Isolated instance {} successful heartbeat'.format(instance.hostname))
elif instance.capacity == 0:
logger.debug('Isolated instance {} previously marked as lost, could not re-join.'.format(
instance.hostname))
else:
logger.warning('Could not update status of isolated instance {}'.format(instance.hostname))
if instance.is_lost(isolated=True):
instance.capacity = 0
instance.save(update_fields=['capacity'])
logger.error('Isolated instance {} last checked in at {}, marked as lost.'.format(
instance.hostname, instance.modified))
finally:
if os.path.exists(facts_path):
shutil.rmtree(facts_path)
@staticmethod
def get_stdout_handle(instance, private_data_dir, event_data_key='job_id'):
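The secrets written to the fifo above have to cross the py3 text/bytes boundary twice: json text -> bytes -> base64 bytes -> str. A round-trip sketch with made-up secrets:

import base64
import json
from django.utils.encoding import smart_bytes, smart_str

secrets = {'ssh_key_data': '-----BEGIN EXAMPLE KEY-----'}       # illustrative
encoded = smart_str(base64.b64encode(smart_bytes(json.dumps(secrets))))
assert json.loads(base64.b64decode(encoded)) == secrets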

View File

@@ -4,7 +4,6 @@ import argparse
import base64
import codecs
import collections
import StringIO
import logging
import json
import os
@@ -13,12 +12,15 @@ import pipes
import re
import signal
import sys
import thread
import threading
import time
try:
from io import StringIO
except ImportError:
from StringIO import StringIO
import pexpect
import psutil
import six
logger = logging.getLogger('awx.main.utils.expect')
@@ -49,7 +51,10 @@ def open_fifo_write(path, data):
reads data from the pipe.
'''
os.mkfifo(path, 0o600)
thread.start_new_thread(lambda p, d: open(p, 'w').write(d), (path, data))
threading.Thread(
target=lambda p, d: open(p, 'w').write(d),
args=(path, data)
).start()
def run_pexpect(args, cwd, env, logfile,
@@ -97,14 +102,8 @@ def run_pexpect(args, cwd, env, logfile,
# enforce usage of an OrderedDict so that the ordering of elements in
# `keys()` matches `values()`.
expect_passwords = collections.OrderedDict(expect_passwords)
password_patterns = expect_passwords.keys()
password_values = expect_passwords.values()
# pexpect needs all env vars to be utf-8 encoded strings
# https://github.com/pexpect/pexpect/issues/512
for k, v in env.items():
if isinstance(v, six.text_type):
env[k] = v.encode('utf-8')
password_patterns = list(expect_passwords.keys())
password_values = list(expect_passwords.values())
child = pexpect.spawn(
args[0], args[1:], cwd=cwd, env=env, ignore_sighup=True,
@@ -232,7 +231,11 @@ def handle_termination(pid, args, proot_cmd, is_cancel=True):
instance's cancel_flag.
'''
try:
if proot_cmd in ' '.join(args):
if sys.version_info > (3, 0):
used_proot = proot_cmd.encode('utf-8') in args
else:
used_proot = proot_cmd in ' '.join(args)
if used_proot:
if not psutil:
os.kill(pid, signal.SIGKILL)
else:
@@ -253,8 +256,8 @@ def handle_termination(pid, args, proot_cmd, is_cancel=True):
def __run__(private_data_dir):
buff = StringIO.StringIO()
with open(os.path.join(private_data_dir, 'env'), 'r') as f:
buff = StringIO()
with codecs.open(os.path.join(private_data_dir, 'env'), 'r', encoding='utf-8') as f:
for line in f:
buff.write(line)
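codecs.open(..., encoding='utf-8') above makes the env-file read independent of the locale's default encoding; a small sketch using a scratch file:

import codecs
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)
with codecs.open(path, 'w', encoding='utf-8') as f:
    f.write('Iñtërnâtiônàlizætiøn')
with codecs.open(path, 'r', encoding='utf-8') as f:
    assert f.read() == 'Iñtërnâtiônàlizætiøn'
os.remove(path)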

View File

@@ -7,7 +7,7 @@ import json
import operator
import re
import six
import urllib
import urllib.parse
from jinja2 import Environment, StrictUndefined
from jinja2.exceptions import UndefinedError, TemplateSyntaxError
@@ -251,6 +251,9 @@ class ImplicitRoleField(models.ForeignKey):
if type(field_name) == tuple:
continue
if type(field_name) == bytes:
field_name = field_name.decode('utf-8')
if field_name.startswith('singleton:'):
continue
@@ -373,7 +376,7 @@ class SmartFilterField(models.TextField):
# https://docs.python.org/2/library/stdtypes.html#truth-value-testing
if not value:
return None
value = urllib.unquote(value)
value = urllib.parse.unquote(value)
try:
SmartFilter().query_from_string(value)
except RuntimeError as e:
@@ -407,9 +410,6 @@ class JSONSchemaField(JSONBField):
self.schema(model_instance),
format_checker=self.format_checker
).iter_errors(value):
# strip Python unicode markers from jsonschema validation errors
error.message = re.sub(r'\bu(\'|")', r'\1', error.message)
if error.validator == 'pattern' and 'error' in error.schema:
error.message = six.text_type(error.schema['error']).format(instance=error.instance)
elif error.validator == 'type':
@@ -514,10 +514,10 @@ class CredentialInputField(JSONSchemaField):
field = field.copy()
if field['type'] == 'become_method':
field.pop('type')
field['choices'] = map(operator.itemgetter(0), CHOICES_PRIVILEGE_ESCALATION_METHODS)
field['choices'] = list(map(operator.itemgetter(0), CHOICES_PRIVILEGE_ESCALATION_METHODS))
properties[field['id']] = field
if field.get('choices', []):
field['enum'] = field['choices'][:]
field['enum'] = list(field['choices'])[:]
return {
'type': 'object',
'properties': properties,
@@ -824,14 +824,14 @@ class CredentialTypeInjectorField(JSONSchemaField):
)
class ExplodingNamespace:
def __unicode__(self):
def __str__(self):
raise UndefinedError(_('Must define unnamed file injector in order to reference `tower.filename`.'))
class TowerNamespace:
def __init__(self):
self.filename = ExplodingNamespace()
def __unicode__(self):
def __str__(self):
raise UndefinedError(_('Cannot directly reference reserved `tower` namespace container.'))
valid_namespace['tower'] = TowerNamespace()
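Likewise, urllib.unquote now lives in urllib.parse, which is what SmartFilterField above relies on when decoding a stored search before validating it:

from urllib.parse import unquote

assert unquote('name__icontains%3Dweb%20server') == 'name__icontains=web server'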

View File

@@ -1,6 +1,7 @@
# Copyright (c) 2015 Ansible, Inc.
# All Rights Reserved
import datetime
from django.utils.encoding import smart_str
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
@@ -35,10 +36,10 @@ class Command(BaseCommand):
).save()
pemfile = Setting.objects.create(
key='AWX_ISOLATED_PUBLIC_KEY',
value=key.public_key().public_bytes(
value=smart_str(key.public_key().public_bytes(
encoding=serialization.Encoding.OpenSSH,
format=serialization.PublicFormat.OpenSSH
) + " generated-by-awx@%s" % datetime.datetime.utcnow().isoformat()
)) + " generated-by-awx@%s" % datetime.datetime.utcnow().isoformat()
)
pemfile.save()
print(pemfile.value)
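public_bytes() returns bytes under py3, so the hunk above wraps it in smart_str() before appending the generated-by suffix; a sketch with a stand-in key blob:

import datetime
from django.utils.encoding import smart_str

public_bytes = b'ssh-rsa AAAAB3Nza EXAMPLE'      # stand-in for key.public_key().public_bytes(...)
value = smart_str(public_bytes) + ' generated-by-awx@%s' % datetime.datetime.utcnow().isoformat()
assert value.startswith('ssh-rsa ')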

View File

@@ -4,6 +4,7 @@
# Python
import json
import logging
import fnmatch
import os
import re
import subprocess
@@ -15,12 +16,20 @@ import shutil
# Django
from django.conf import settings
from django.core.management.base import BaseCommand, CommandError
from django.core.exceptions import ImproperlyConfigured
from django.db import connection, transaction
from django.utils.encoding import smart_text
# AWX
from awx.main.models import * # noqa
# AWX inventory imports
from awx.main.models.inventory import (
Inventory,
InventorySource,
InventoryUpdate,
Host
)
from awx.main.utils.mem_inventory import MemInventory, dict_to_mem_data
# other AWX imports
from awx.main.models.rbac import batch_role_ancestor_rebuilding
from awx.main.utils import (
ignore_inventory_computed_fields,
check_proot_installed,
@@ -28,7 +37,6 @@ from awx.main.utils import (
build_proot_temp_dir,
get_licenser
)
from awx.main.utils.mem_inventory import MemInventory, dict_to_mem_data
from awx.main.signals import disable_activity_stream
from awx.main.constants import STANDARD_INVENTORY_UPDATE_ENV
@@ -63,60 +71,46 @@ class AnsibleInventoryLoader(object):
use the ansible-inventory CLI utility to convert it into in-memory
representational objects. Example:
/usr/bin/ansible/ansible-inventory -i hosts --list
If it fails to find this, it uses the backported script instead
'''
def __init__(self, source, group_filter_re=None, host_filter_re=None, is_custom=False):
def __init__(self, source, is_custom=False, venv_path=None):
self.source = source
self.source_dir = functioning_dir(self.source)
self.is_custom = is_custom
self.tmp_private_dir = None
self.method = 'ansible-inventory'
self.group_filter_re = group_filter_re
self.host_filter_re = host_filter_re
self.is_vendored_source = False
if self.source_dir == os.path.join(settings.BASE_DIR, 'plugins', 'inventory'):
self.is_vendored_source = True
if venv_path:
self.venv_path = venv_path
else:
self.venv_path = settings.ANSIBLE_VENV_PATH
def build_env(self):
env = dict(os.environ.items())
env['VIRTUAL_ENV'] = settings.ANSIBLE_VENV_PATH
env['PATH'] = os.path.join(settings.ANSIBLE_VENV_PATH, "bin") + ":" + env['PATH']
env['VIRTUAL_ENV'] = self.venv_path
env['PATH'] = os.path.join(self.venv_path, "bin") + ":" + env['PATH']
# Set configuration items that should always be used for updates
for key, value in STANDARD_INVENTORY_UPDATE_ENV.items():
if key not in env:
env[key] = value
venv_libdir = os.path.join(settings.ANSIBLE_VENV_PATH, "lib")
venv_libdir = os.path.join(self.venv_path, "lib")
env.pop('PYTHONPATH', None) # default to none if no python_ver matches
if os.path.isdir(os.path.join(venv_libdir, "python2.7")):
env['PYTHONPATH'] = os.path.join(venv_libdir, "python2.7", "site-packages") + ":"
for version in os.listdir(venv_libdir):
if fnmatch.fnmatch(version, 'python[23].*'):
if os.path.isdir(os.path.join(venv_libdir, version)):
env['PYTHONPATH'] = os.path.join(venv_libdir, version, "site-packages") + ":"
break
# For internal inventory updates, these are not reported in the job_env API
logger.info('Using VIRTUAL_ENV: {}'.format(env['VIRTUAL_ENV']))
logger.info('Using PATH: {}'.format(env['PATH']))
logger.info('Using PYTHONPATH: {}'.format(env.get('PYTHONPATH', None)))
return env
def get_base_args(self):
# get ansible-inventory absolute path for running in bubblewrap/proot, in Popen
for path in os.environ["PATH"].split(os.pathsep):
potential_path = os.path.join(path.strip('"'), 'ansible-inventory')
if os.path.isfile(potential_path) and os.access(potential_path, os.X_OK):
logger.debug('Using system install of ansible-inventory CLI: {}'.format(potential_path))
return [potential_path, '-i', self.source]
# Stopgap solution for group_vars, do not use backported module for official
# vendored cloud modules or custom scripts TODO: remove after Ansible 2.3 deprecation
if self.is_vendored_source or self.is_custom:
self.method = 'inventory script invocation'
return [self.source]
# ansible-inventory was not found, look for backported module TODO: remove after Ansible 2.3 deprecation
abs_module_path = os.path.abspath(os.path.join(
os.path.dirname(__file__), '..', '..', '..', 'plugins',
'ansible_inventory', 'backport.py'))
self.method = 'ansible-inventory backport'
if not os.path.exists(abs_module_path):
raise ImproperlyConfigured('Cannot find inventory module')
logger.debug('Using backported ansible-inventory module: {}'.format(abs_module_path))
return [abs_module_path, '-i', self.source]
abs_ansible_inventory = shutil.which('ansible-inventory')
bargs = [abs_ansible_inventory, '-i', self.source]
logger.debug('Using base command: {}'.format(' '.join(bargs)))
return bargs
def get_proot_args(self, cmd, env):
cwd = os.getcwd()
@@ -155,6 +149,8 @@ class AnsibleInventoryLoader(object):
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, env=env)
stdout, stderr = proc.communicate()
stdout = smart_text(stdout)
stderr = smart_text(stderr)
if self.tmp_private_dir:
shutil.rmtree(self.tmp_private_dir, True)
@@ -177,80 +173,7 @@ class AnsibleInventoryLoader(object):
base_args = self.get_base_args()
logger.info('Reading Ansible inventory source: %s', self.source)
data = self.command_to_json(base_args + ['--list'])
# TODO: remove after we run custom scripts through ansible-inventory
if self.is_custom and '_meta' not in data or 'hostvars' not in data['_meta']:
# Invoke the executable once for each host name we've built up
# to set their variables
data.setdefault('_meta', {})
data['_meta'].setdefault('hostvars', {})
logger.warning('Re-calling script for hostvars individually.')
for group_name, group_data in data.iteritems():
if group_name == '_meta':
continue
if isinstance(group_data, dict):
group_host_list = group_data.get('hosts', [])
elif isinstance(group_data, list):
group_host_list = group_data
else:
logger.warning('Group data for "%s" is not a dict or list',
group_name)
group_host_list = []
for hostname in group_host_list:
logger.debug('Obtaining hostvars for %s' % hostname.encode('utf-8'))
hostdata = self.command_to_json(
base_args + ['--host', hostname.encode("utf-8")]
)
if isinstance(hostdata, dict):
data['_meta']['hostvars'][hostname] = hostdata
else:
logger.warning(
'Expected dict of vars for host "%s" when '
'calling with `--host`, got %s instead',
k, str(type(data))
)
logger.info('Processing JSON output...')
inventory = MemInventory(
group_filter_re=self.group_filter_re, host_filter_re=self.host_filter_re)
inventory = dict_to_mem_data(data, inventory=inventory)
return inventory
def load_inventory_source(source, group_filter_re=None,
host_filter_re=None, exclude_empty_groups=False,
is_custom=False):
'''
Load inventory from given source directory or file.
'''
# Sanity check: We sanitize these module names for our API but Ansible proper doesn't follow
# good naming conventions
source = source.replace('rhv.py', 'ovirt4.py')
source = source.replace('satellite6.py', 'foreman.py')
source = source.replace('vmware.py', 'vmware_inventory.py')
if not os.path.exists(source):
raise IOError('Source does not exist: %s' % source)
source = os.path.join(os.getcwd(), os.path.dirname(source),
os.path.basename(source))
source = os.path.normpath(os.path.abspath(source))
inventory = AnsibleInventoryLoader(
source=source,
group_filter_re=group_filter_re,
host_filter_re=host_filter_re,
is_custom=is_custom).load()
logger.debug('Finished loading from source: %s', source)
# Exclude groups that are completely empty.
if exclude_empty_groups:
inventory.delete_empty_groups()
logger.info('Loaded %d groups, %d hosts', len(inventory.all_group.all_groups),
len(inventory.all_group.all_hosts))
return inventory.all_group
return self.command_to_json(base_args + ['--list'])
class Command(BaseCommand):
@@ -268,6 +191,8 @@ class Command(BaseCommand):
parser.add_argument('--inventory-id', dest='inventory_id', type=int,
default=None, metavar='i',
help='id of inventory to sync')
parser.add_argument('--venv', dest='venv', type=str, default=None,
help='absolute path to the AWX custom virtualenv to use')
parser.add_argument('--overwrite', dest='overwrite', action='store_true', default=False,
help='overwrite the destination hosts and groups')
parser.add_argument('--overwrite-vars', dest='overwrite_vars',
@@ -347,7 +272,7 @@ class Command(BaseCommand):
if enabled is not default:
enabled_value = getattr(self, 'enabled_value', None)
if enabled_value is not None:
enabled = bool(unicode(enabled_value) == unicode(enabled))
enabled = bool(str(enabled_value) == str(enabled))
else:
enabled = bool(enabled)
if enabled is default:
@@ -357,6 +282,19 @@ class Command(BaseCommand):
else:
raise NotImplementedError('Value of enabled {} not understood.'.format(enabled))
def get_source_absolute_path(self, source):
# Sanity check: We sanitize these module names for our API but Ansible proper doesn't follow
# good naming conventions
source = source.replace('rhv.py', 'ovirt4.py')
source = source.replace('satellite6.py', 'foreman.py')
source = source.replace('vmware.py', 'vmware_inventory.py')
if not os.path.exists(source):
raise IOError('Source does not exist: %s' % source)
source = os.path.join(os.getcwd(), os.path.dirname(source),
os.path.basename(source))
source = os.path.normpath(os.path.abspath(source))
return source
def load_inventory_from_database(self):
'''
Load inventory and related objects from the database.
@@ -369,9 +307,9 @@ class Command(BaseCommand):
try:
self.inventory = Inventory.objects.get(**q)
except Inventory.DoesNotExist:
raise CommandError('Inventory with %s = %s cannot be found' % q.items()[0])
raise CommandError('Inventory with %s = %s cannot be found' % list(q.items())[0])
except Inventory.MultipleObjectsReturned:
raise CommandError('Inventory with %s = %s returned multiple results' % q.items()[0])
raise CommandError('Inventory with %s = %s returned multiple results' % list(q.items())[0])
logger.info('Updating inventory %d: %s' % (self.inventory.pk,
self.inventory.name))
@@ -471,7 +409,7 @@ class Command(BaseCommand):
if self.instance_id_var:
all_instance_ids = self.mem_instance_id_map.keys()
instance_ids = []
for offset in xrange(0, len(all_instance_ids), self._batch_size):
for offset in range(0, len(all_instance_ids), self._batch_size):
instance_ids = all_instance_ids[offset:(offset + self._batch_size)]
for host_pk in hosts_qs.filter(instance_id__in=instance_ids).values_list('pk', flat=True):
del_host_pks.discard(host_pk)
@@ -479,14 +417,14 @@ class Command(BaseCommand):
del_host_pks.discard(host_pk)
all_host_names = list(set(self.mem_instance_id_map.values()) - set(self.all_group.all_hosts.keys()))
else:
all_host_names = self.all_group.all_hosts.keys()
for offset in xrange(0, len(all_host_names), self._batch_size):
all_host_names = list(self.all_group.all_hosts.keys())
for offset in range(0, len(all_host_names), self._batch_size):
host_names = all_host_names[offset:(offset + self._batch_size)]
for host_pk in hosts_qs.filter(name__in=host_names).values_list('pk', flat=True):
del_host_pks.discard(host_pk)
# Now delete all remaining hosts in batches.
all_del_pks = sorted(list(del_host_pks))
for offset in xrange(0, len(all_del_pks), self._batch_size):
for offset in range(0, len(all_del_pks), self._batch_size):
del_pks = all_del_pks[offset:(offset + self._batch_size)]
for host in hosts_qs.filter(pk__in=del_pks):
host_name = host.name
@@ -509,8 +447,8 @@ class Command(BaseCommand):
groups_qs = self.inventory_source.groups.all()
# Build list of all group pks, remove those that should not be deleted.
del_group_pks = set(groups_qs.values_list('pk', flat=True))
all_group_names = self.all_group.all_groups.keys()
for offset in xrange(0, len(all_group_names), self._batch_size):
all_group_names = list(self.all_group.all_groups.keys())
for offset in range(0, len(all_group_names), self._batch_size):
group_names = all_group_names[offset:(offset + self._batch_size)]
for group_pk in groups_qs.filter(name__in=group_names).values_list('pk', flat=True):
del_group_pks.discard(group_pk)
@@ -522,7 +460,7 @@ class Command(BaseCommand):
del_group_pks.discard(self.inventory_source.deprecated_group_id)
# Now delete all remaining groups in batches.
all_del_pks = sorted(list(del_group_pks))
for offset in xrange(0, len(all_del_pks), self._batch_size):
for offset in range(0, len(all_del_pks), self._batch_size):
del_pks = all_del_pks[offset:(offset + self._batch_size)]
for group in groups_qs.filter(pk__in=del_pks):
group_name = group.name
@@ -561,7 +499,7 @@ class Command(BaseCommand):
for mem_group in mem_children:
db_children_name_pk_map.pop(mem_group.name, None)
del_child_group_pks = list(set(db_children_name_pk_map.values()))
for offset in xrange(0, len(del_child_group_pks), self._batch_size):
for offset in range(0, len(del_child_group_pks), self._batch_size):
child_group_pks = del_child_group_pks[offset:(offset + self._batch_size)]
for db_child in db_children.filter(pk__in=child_group_pks):
group_group_count += 1
@@ -574,12 +512,12 @@ class Command(BaseCommand):
del_host_pks = set(db_hosts.values_list('pk', flat=True))
mem_hosts = self.all_group.all_groups[db_group.name].hosts
all_mem_host_names = [h.name for h in mem_hosts if not h.instance_id]
for offset in xrange(0, len(all_mem_host_names), self._batch_size):
for offset in range(0, len(all_mem_host_names), self._batch_size):
mem_host_names = all_mem_host_names[offset:(offset + self._batch_size)]
for db_host_pk in db_hosts.filter(name__in=mem_host_names).values_list('pk', flat=True):
del_host_pks.discard(db_host_pk)
all_mem_instance_ids = [h.instance_id for h in mem_hosts if h.instance_id]
for offset in xrange(0, len(all_mem_instance_ids), self._batch_size):
for offset in range(0, len(all_mem_instance_ids), self._batch_size):
mem_instance_ids = all_mem_instance_ids[offset:(offset + self._batch_size)]
for db_host_pk in db_hosts.filter(instance_id__in=mem_instance_ids).values_list('pk', flat=True):
del_host_pks.discard(db_host_pk)
@@ -587,7 +525,7 @@ class Command(BaseCommand):
for db_host_pk in all_db_host_pks:
del_host_pks.discard(db_host_pk)
del_host_pks = list(del_host_pks)
for offset in xrange(0, len(del_host_pks), self._batch_size):
for offset in range(0, len(del_host_pks), self._batch_size):
del_pks = del_host_pks[offset:(offset + self._batch_size)]
for db_host in db_hosts.filter(pk__in=del_pks):
group_host_count += 1
@@ -635,7 +573,7 @@ class Command(BaseCommand):
if len(v.parents) == 1 and v.parents[0].name == 'all':
root_group_names.add(k)
existing_group_names = set()
for offset in xrange(0, len(all_group_names), self._batch_size):
for offset in range(0, len(all_group_names), self._batch_size):
group_names = all_group_names[offset:(offset + self._batch_size)]
for group in self.inventory.groups.filter(name__in=group_names):
mem_group = self.all_group.all_groups[group.name]
@@ -739,7 +677,7 @@ class Command(BaseCommand):
mem_host_instance_id_map = {}
mem_host_name_map = {}
mem_host_names_to_update = set(self.all_group.all_hosts.keys())
for k,v in self.all_group.all_hosts.iteritems():
for k,v in self.all_group.all_hosts.items():
mem_host_name_map[k] = v
instance_id = self._get_instance_id(v.variables)
if instance_id in self.db_instance_id_map:
@@ -749,7 +687,7 @@ class Command(BaseCommand):
# Update all existing hosts where we know the PK based on instance_id.
all_host_pks = sorted(mem_host_pk_map.keys())
for offset in xrange(0, len(all_host_pks), self._batch_size):
for offset in range(0, len(all_host_pks), self._batch_size):
host_pks = all_host_pks[offset:(offset + self._batch_size)]
for db_host in self.inventory.hosts.filter( pk__in=host_pks):
if db_host.pk in host_pks_updated:
@@ -761,7 +699,7 @@ class Command(BaseCommand):
# Update all existing hosts where we know the instance_id.
all_instance_ids = sorted(mem_host_instance_id_map.keys())
for offset in xrange(0, len(all_instance_ids), self._batch_size):
for offset in range(0, len(all_instance_ids), self._batch_size):
instance_ids = all_instance_ids[offset:(offset + self._batch_size)]
for db_host in self.inventory.hosts.filter( instance_id__in=instance_ids):
if db_host.pk in host_pks_updated:
@@ -773,7 +711,7 @@ class Command(BaseCommand):
# Update all existing hosts by name.
all_host_names = sorted(mem_host_name_map.keys())
for offset in xrange(0, len(all_host_names), self._batch_size):
for offset in range(0, len(all_host_names), self._batch_size):
host_names = all_host_names[offset:(offset + self._batch_size)]
for db_host in self.inventory.hosts.filter( name__in=host_names):
if db_host.pk in host_pks_updated:
@@ -815,15 +753,15 @@ class Command(BaseCommand):
'''
if settings.SQL_DEBUG:
queries_before = len(connection.queries)
all_group_names = sorted([k for k,v in self.all_group.all_groups.iteritems() if v.children])
all_group_names = sorted([k for k,v in self.all_group.all_groups.items() if v.children])
group_group_count = 0
for offset in xrange(0, len(all_group_names), self._batch_size):
for offset in range(0, len(all_group_names), self._batch_size):
group_names = all_group_names[offset:(offset + self._batch_size)]
for db_group in self.inventory.groups.filter(name__in=group_names):
mem_group = self.all_group.all_groups[db_group.name]
group_group_count += len(mem_group.children)
all_child_names = sorted([g.name for g in mem_group.children])
for offset2 in xrange(0, len(all_child_names), self._batch_size):
for offset2 in range(0, len(all_child_names), self._batch_size):
child_names = all_child_names[offset2:(offset2 + self._batch_size)]
db_children_qs = self.inventory.groups.filter(name__in=child_names)
for db_child in db_children_qs.filter(children__id=db_group.id):
@@ -842,15 +780,15 @@ class Command(BaseCommand):
# belongs.
if settings.SQL_DEBUG:
queries_before = len(connection.queries)
all_group_names = sorted([k for k,v in self.all_group.all_groups.iteritems() if v.hosts])
all_group_names = sorted([k for k,v in self.all_group.all_groups.items() if v.hosts])
group_host_count = 0
for offset in xrange(0, len(all_group_names), self._batch_size):
for offset in range(0, len(all_group_names), self._batch_size):
group_names = all_group_names[offset:(offset + self._batch_size)]
for db_group in self.inventory.groups.filter(name__in=group_names):
mem_group = self.all_group.all_groups[db_group.name]
group_host_count += len(mem_group.hosts)
all_host_names = sorted([h.name for h in mem_group.hosts if not h.instance_id])
for offset2 in xrange(0, len(all_host_names), self._batch_size):
for offset2 in range(0, len(all_host_names), self._batch_size):
host_names = all_host_names[offset2:(offset2 + self._batch_size)]
db_hosts_qs = self.inventory.hosts.filter(name__in=host_names)
for db_host in db_hosts_qs.filter(groups__id=db_group.id):
@@ -859,7 +797,7 @@ class Command(BaseCommand):
self._batch_add_m2m(db_group.hosts, db_host)
logger.debug('Host "%s" added to group "%s"', db_host.name, db_group.name)
all_instance_ids = sorted([h.instance_id for h in mem_group.hosts if h.instance_id])
for offset2 in xrange(0, len(all_instance_ids), self._batch_size):
for offset2 in range(0, len(all_instance_ids), self._batch_size):
instance_ids = all_instance_ids[offset2:(offset2 + self._batch_size)]
db_hosts_qs = self.inventory.hosts.filter(instance_id__in=instance_ids)
for db_host in db_hosts_qs.filter(groups__id=db_group.id):
@@ -926,6 +864,7 @@ class Command(BaseCommand):
self.set_logging_level()
self.inventory_name = options.get('inventory_name', None)
self.inventory_id = options.get('inventory_id', None)
venv_path = options.get('venv', None)
self.overwrite = bool(options.get('overwrite', False))
self.overwrite_vars = bool(options.get('overwrite_vars', False))
self.keep_vars = bool(options.get('keep_vars', False))
@@ -986,12 +925,26 @@ class Command(BaseCommand):
self.inventory_update.status = 'running'
self.inventory_update.save()
# Load inventory from source.
self.all_group = load_inventory_source(self.source,
self.group_filter_re,
self.host_filter_re,
self.exclude_empty_groups,
self.is_custom)
source = self.get_source_absolute_path(self.source)
data = AnsibleInventoryLoader(source=source, is_custom=self.is_custom, venv_path=venv_path).load()
logger.debug('Finished loading from source: %s', source)
logger.info('Processing JSON output...')
inventory = MemInventory(
group_filter_re=self.group_filter_re, host_filter_re=self.host_filter_re)
inventory = dict_to_mem_data(data, inventory=inventory)
del data # forget dict from import, could be large
logger.info('Loaded %d groups, %d hosts', len(inventory.all_group.all_groups),
len(inventory.all_group.all_hosts))
if self.exclude_empty_groups:
inventory.delete_empty_groups()
self.all_group = inventory.all_group
if settings.DEBUG:
# depending on inventory source, this output can be
# *exceedingly* verbose - crawling a deeply nested
@@ -1074,4 +1027,4 @@ class Command(BaseCommand):
if exc and isinstance(exc, CommandError):
sys.exit(1)
elif exc:
raise
raise exc
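The fnmatch loop above generalizes the old hard-coded python2.7 lookup to any python2.x/python3.x directory inside the virtualenv; a standalone sketch of the same idea (the function name is made up):

import fnmatch
import os

def venv_site_packages(venv_path):
    venv_libdir = os.path.join(venv_path, 'lib')
    for version in sorted(os.listdir(venv_libdir)):
        if fnmatch.fnmatch(version, 'python[23].*'):
            candidate = os.path.join(venv_libdir, version)
            if os.path.isdir(candidate):
                return os.path.join(candidate, 'site-packages')
    return None   # no matching interpreter dir: leave PYTHONPATH unset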

View File

@@ -19,11 +19,11 @@ class InstanceNotFound(Exception):
class Command(BaseCommand):
def add_arguments(self, parser):
parser.add_argument('--queuename', dest='queuename', type=lambda s: six.text_type(s, 'utf8'),
parser.add_argument('--queuename', dest='queuename', type=str,
help='Queue to create/update')
parser.add_argument('--hostnames', dest='hostnames', type=lambda s: six.text_type(s, 'utf8'),
parser.add_argument('--hostnames', dest='hostnames', type=str,
help='Comma-Delimited Hosts to add to the Queue (will not remove already assigned instances)')
parser.add_argument('--controller', dest='controller', type=lambda s: six.text_type(s, 'utf8'),
parser.add_argument('--controller', dest='controller', type=str,
default='', help='The controlling group (makes this an isolated group)')
parser.add_argument('--instance_percent', dest='instance_percent', type=int, default=0,
help='The percentage of active instances that will be assigned to this group'),
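The argparse changes above all follow one rule: under Python 3, argparse already hands back str objects, so the six.text_type(s, 'utf8') decoding hook is redundant. For example:

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument('--queuename', dest='queuename', type=str,
                        help='Queue to create/update')
    args = parser.parse_args(['--queuename', 'tower'])
    print(type(args.queuename), args.queuename)   # <class 'str'> tower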


@@ -154,7 +154,7 @@ class ReplayJobEvents(JobStatusLifeCycle):
continue
if debug:
raw_input("{} of {}:".format(n, job_event_count))
input("{} of {}:".format(n, job_event_count))
if not je_previous:
stats['recording_start'] = je_current.created


@@ -19,7 +19,7 @@ logger = logging.getLogger('awx.main.dispatch')
def construct_bcast_queue_name(common_name):
return common_name.encode('utf8') + '_' + settings.CLUSTER_HOST_ID
return common_name + '_' + settings.CLUSTER_HOST_ID
class Command(BaseCommand):
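Dropping the .encode('utf8') matters because Python 3 refuses to mix bytes and str in concatenation; a quick illustration (the host ID below is a stand-in for settings.CLUSTER_HOST_ID):

    cluster_host_id = 'awx-1'
    common_name = 'tower_broadcast_all'

    # common_name.encode('utf8') + '_' + cluster_host_id   # TypeError on Python 3
    print(common_name + '_' + cluster_host_id)              # tower_broadcast_all_awx-1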
@@ -69,21 +69,42 @@ class Command(BaseCommand):
return TaskResult()
sched_file = '/var/lib/awx/beat.db'
app = Celery()
app.conf.BROKER_URL = settings.BROKER_URL
app.conf.CELERY_TASK_RESULT_EXPIRES = False
# celery in py3 seems to have a bug where the celerybeat schedule
# shelve can become corrupted; we've _only_ seen this in Ubuntu and py36
# it can be avoided by detecting and removing the corrupted file
# at some point, we'll just stop using celerybeat, because it's clearly
# buggy, too -_-
#
# https://github.com/celery/celery/issues/4777
sched = AWXScheduler(schedule_filename=sched_file, app=app)
try:
sched.setup_schedule()
except Exception:
logger.exception('{} is corrupted, removing.'.format(sched_file))
sched._remove_db()
finally:
try:
sched.close()
except Exception:
logger.exception('{} failed to sync/close'.format(sched_file))
beat.Beat(
30,
app,
schedule='/var/lib/awx/beat.db', scheduler_cls=AWXScheduler
schedule=sched_file, scheduler_cls=AWXScheduler
).run()
def handle(self, *arg, **options):
if options.get('status'):
print Control('dispatcher').status()
print(Control('dispatcher').status())
return
if options.get('running'):
print Control('dispatcher').running()
print(Control('dispatcher').running())
return
if options.get('reload'):
return Control('dispatcher').control({'control': 'reload'})
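A hedged sketch of the guard introduced above (generic shelve handling, not the AWXScheduler API): open the beat schedule file and, if it turns out to be corrupted, log and remove it so beat can rebuild a fresh one on the next start.

    import logging
    import os
    import shelve

    logger = logging.getLogger(__name__)
    sched_file = '/var/lib/awx/beat.db'

    try:
        with shelve.open(sched_file) as db:
            list(db.keys())          # touching the shelve surfaces corruption early
    except Exception:
        logger.exception('%s is corrupted, removing.', sched_file)
        for candidate in (sched_file, sched_file + '.db'):
            if os.path.exists(candidate):
                os.remove(candidate)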


@@ -126,8 +126,9 @@ class SessionTimeoutMiddleware(object):
"""
def process_response(self, request, response):
should_skip = 'HTTP_X_WS_SESSION_QUIET' in request.META
req_session = getattr(request, 'session', None)
if req_session and not req_session.is_empty():
if req_session and not req_session.is_empty() and should_skip is False:
expiry = int(settings.SESSION_COOKIE_AGE)
request.session.set_expiry(expiry)
response['Session-Timeout'] = expiry
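Client-side, the new X-WS-SESSION-QUIET request header lets background or polling calls avoid refreshing the session: when it is present, the middleware above skips both the expiry reset and the Session-Timeout response header. A rough usage example (URL and auth are placeholders):

    import requests

    resp = requests.get(
        'https://awx.example.com/api/v2/me/',
        headers={'X-WS-SESSION-QUIET': 'true'},
    )
    print('Session-Timeout' in resp.headers)   # False when the quiet header is honored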


@@ -27,7 +27,7 @@ class Migration(migrations.Migration):
name='ActivityStream',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('operation', models.CharField(max_length=13, choices=[(b'create', 'Entity Created'), (b'update', 'Entity Updated'), (b'delete', 'Entity Deleted'), (b'associate', 'Entity Associated with another Entity'), (b'disassociate', 'Entity was Disassociated with another Entity')])),
('operation', models.CharField(max_length=13, choices=[('create', 'Entity Created'), ('update', 'Entity Updated'), ('delete', 'Entity Deleted'), ('associate', 'Entity Associated with another Entity'), ('disassociate', 'Entity was Disassociated with another Entity')])),
('timestamp', models.DateTimeField(auto_now_add=True)),
('changes', models.TextField(blank=True)),
('object_relationship_type', models.TextField(blank=True)),
@@ -42,8 +42,8 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('host_name', models.CharField(default=b'', max_length=1024, editable=False)),
('event', models.CharField(max_length=100, choices=[(b'runner_on_failed', 'Host Failed'), (b'runner_on_ok', 'Host OK'), (b'runner_on_unreachable', 'Host Unreachable'), (b'runner_on_skipped', 'Host Skipped')])),
('host_name', models.CharField(default='', max_length=1024, editable=False)),
('event', models.CharField(max_length=100, choices=[('runner_on_failed', 'Host Failed'), ('runner_on_ok', 'Host OK'), ('runner_on_unreachable', 'Host Unreachable'), ('runner_on_skipped', 'Host Skipped')])),
('event_data', jsonfield.fields.JSONField(default={}, blank=True)),
('failed', models.BooleanField(default=False, editable=False)),
('changed', models.BooleanField(default=False, editable=False)),
@@ -60,8 +60,8 @@ class Migration(migrations.Migration):
('created', models.DateTimeField(auto_now_add=True)),
('modified', models.DateTimeField(auto_now=True)),
('expires', models.DateTimeField(default=django.utils.timezone.now)),
('request_hash', models.CharField(default=b'', max_length=40, blank=True)),
('reason', models.CharField(default=b'', help_text='Reason the auth token was invalidated.', max_length=1024, blank=True)),
('request_hash', models.CharField(default='', max_length=40, blank=True)),
('reason', models.CharField(default='', help_text='Reason the auth token was invalidated.', max_length=1024, blank=True)),
('user', models.ForeignKey(related_name='auth_tokens', to=settings.AUTH_USER_MODEL)),
],
),
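The migration hunks in this file (and those that follow) all apply one mechanical Python 3 cleanup: byte-string defaults and choice keys written as b'' become plain str, since bytes and str no longer compare equal and Django text fields expect str. A two-line illustration:

    print(b'' == '')        # False on Python 3 -- a b'' default is not the empty string
    print(b'ssh' == 'ssh')  # False as well, so byte-string choice keys would misbehave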
@@ -71,22 +71,22 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('active', models.BooleanField(default=True, editable=False)),
('name', models.CharField(max_length=512)),
('kind', models.CharField(default=b'ssh', max_length=32, choices=[(b'ssh', 'Machine'), (b'scm', 'Source Control'), (b'aws', 'Amazon Web Services'), (b'rax', 'Rackspace'), (b'vmware', 'VMware vCenter'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure'), (b'openstack', 'OpenStack')])),
('kind', models.CharField(default='ssh', max_length=32, choices=[('ssh', 'Machine'), ('scm', 'Source Control'), ('aws', 'Amazon Web Services'), ('rax', 'Rackspace'), ('vmware', 'VMware vCenter'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure'), ('openstack', 'OpenStack')])),
('cloud', models.BooleanField(default=False, editable=False)),
('host', models.CharField(default=b'', help_text='The hostname or IP address to use.', max_length=1024, verbose_name='Host', blank=True)),
('username', models.CharField(default=b'', help_text='Username for this credential.', max_length=1024, verbose_name='Username', blank=True)),
('password', models.CharField(default=b'', help_text='Password for this credential (or "ASK" to prompt the user for machine credentials).', max_length=1024, verbose_name='Password', blank=True)),
('security_token', models.CharField(default=b'', help_text='Security Token for this credential', max_length=1024, verbose_name='Security Token', blank=True)),
('project', models.CharField(default=b'', help_text='The identifier for the project.', max_length=100, verbose_name='Project', blank=True)),
('ssh_key_data', models.TextField(default=b'', help_text='RSA or DSA private key to be used instead of password.', verbose_name='SSH private key', blank=True)),
('ssh_key_unlock', models.CharField(default=b'', help_text='Passphrase to unlock SSH private key if encrypted (or "ASK" to prompt the user for machine credentials).', max_length=1024, verbose_name='SSH key unlock', blank=True)),
('become_method', models.CharField(default=b'', help_text='Privilege escalation method.', max_length=32, blank=True, choices=[(b'', 'None'), (b'sudo', 'Sudo'), (b'su', 'Su'), (b'pbrun', 'Pbrun'), (b'pfexec', 'Pfexec')])),
('become_username', models.CharField(default=b'', help_text='Privilege escalation username.', max_length=1024, blank=True)),
('become_password', models.CharField(default=b'', help_text='Password for privilege escalation method.', max_length=1024, blank=True)),
('vault_password', models.CharField(default=b'', help_text='Vault password (or "ASK" to prompt the user).', max_length=1024, blank=True)),
('host', models.CharField(default='', help_text='The hostname or IP address to use.', max_length=1024, verbose_name='Host', blank=True)),
('username', models.CharField(default='', help_text='Username for this credential.', max_length=1024, verbose_name='Username', blank=True)),
('password', models.CharField(default='', help_text='Password for this credential (or "ASK" to prompt the user for machine credentials).', max_length=1024, verbose_name='Password', blank=True)),
('security_token', models.CharField(default='', help_text='Security Token for this credential', max_length=1024, verbose_name='Security Token', blank=True)),
('project', models.CharField(default='', help_text='The identifier for the project.', max_length=100, verbose_name='Project', blank=True)),
('ssh_key_data', models.TextField(default='', help_text='RSA or DSA private key to be used instead of password.', verbose_name='SSH private key', blank=True)),
('ssh_key_unlock', models.CharField(default='', help_text='Passphrase to unlock SSH private key if encrypted (or "ASK" to prompt the user for machine credentials).', max_length=1024, verbose_name='SSH key unlock', blank=True)),
('become_method', models.CharField(default='', help_text='Privilege escalation method.', max_length=32, blank=True, choices=[('', 'None'), ('sudo', 'Sudo'), ('su', 'Su'), ('pbrun', 'Pbrun'), ('pfexec', 'Pfexec')])),
('become_username', models.CharField(default='', help_text='Privilege escalation username.', max_length=1024, blank=True)),
('become_password', models.CharField(default='', help_text='Password for privilege escalation method.', max_length=1024, blank=True)),
('vault_password', models.CharField(default='', help_text='Vault password (or "ASK" to prompt the user).', max_length=1024, blank=True)),
('created_by', models.ForeignKey(related_name="{u'class': 'credential', u'app_label': 'main'}(class)s_created+", on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
('modified_by', models.ForeignKey(related_name="{u'class': 'credential', u'app_label': 'main'}(class)s_modified+", on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
('tags', taggit.managers.TaggableManager(to='taggit.Tag', through='taggit.TaggedItem', blank=True, help_text='A comma-separated list of tags.', verbose_name='Tags')),
@@ -101,10 +101,10 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('active', models.BooleanField(default=True, editable=False)),
('name', models.CharField(max_length=512)),
('script', models.TextField(default=b'', help_text='Inventory script contents', blank=True)),
('script', models.TextField(default='', help_text='Inventory script contents', blank=True)),
('created_by', models.ForeignKey(related_name="{u'class': 'custominventoryscript', u'app_label': 'main'}(class)s_created+", on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
('modified_by', models.ForeignKey(related_name="{u'class': 'custominventoryscript', u'app_label': 'main'}(class)s_modified+", on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
],
@@ -118,10 +118,10 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('active', models.BooleanField(default=True, editable=False)),
('name', models.CharField(max_length=512)),
('variables', models.TextField(default=b'', help_text='Group variables in JSON or YAML format.', blank=True)),
('variables', models.TextField(default='', help_text='Group variables in JSON or YAML format.', blank=True)),
('total_hosts', models.PositiveIntegerField(default=0, help_text='Total number of hosts directly or indirectly in this group.', editable=False)),
('has_active_failures', models.BooleanField(default=False, help_text='Flag indicating whether this group has any hosts with active failures.', editable=False)),
('hosts_with_active_failures', models.PositiveIntegerField(default=0, help_text='Number of hosts in this group with active failures.', editable=False)),
@@ -140,12 +140,12 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('active', models.BooleanField(default=True, editable=False)),
('name', models.CharField(max_length=512)),
('enabled', models.BooleanField(default=True, help_text='Is this host online and available for running jobs?')),
('instance_id', models.CharField(default=b'', max_length=100, blank=True)),
('variables', models.TextField(default=b'', help_text='Host variables in JSON or YAML format.', blank=True)),
('instance_id', models.CharField(default='', max_length=100, blank=True)),
('variables', models.TextField(default='', help_text='Host variables in JSON or YAML format.', blank=True)),
('has_active_failures', models.BooleanField(default=False, help_text='Flag indicating whether the last job failed for this host.', editable=False)),
('has_inventory_sources', models.BooleanField(default=False, help_text='Flag indicating whether this host was created/updated from any external inventory sources.', editable=False)),
('created_by', models.ForeignKey(related_name="{u'class': 'host', u'app_label': 'main'}(class)s_created+", on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
@@ -171,10 +171,10 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('active', models.BooleanField(default=True, editable=False)),
('name', models.CharField(unique=True, max_length=512)),
('variables', models.TextField(default=b'', help_text='Inventory variables in JSON or YAML format.', blank=True)),
('variables', models.TextField(default='', help_text='Inventory variables in JSON or YAML format.', blank=True)),
('has_active_failures', models.BooleanField(default=False, help_text='Flag indicating whether any hosts in this inventory have failed.', editable=False)),
('total_hosts', models.PositiveIntegerField(default=0, help_text='Total number of hosts in this inventory.', editable=False)),
('hosts_with_active_failures', models.PositiveIntegerField(default=0, help_text='Number of hosts in this inventory with active failures.', editable=False)),
@@ -197,14 +197,14 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('event', models.CharField(max_length=100, choices=[(b'runner_on_failed', 'Host Failed'), (b'runner_on_ok', 'Host OK'), (b'runner_on_error', 'Host Failure'), (b'runner_on_skipped', 'Host Skipped'), (b'runner_on_unreachable', 'Host Unreachable'), (b'runner_on_no_hosts', 'No Hosts Remaining'), (b'runner_on_async_poll', 'Host Polling'), (b'runner_on_async_ok', 'Host Async OK'), (b'runner_on_async_failed', 'Host Async Failure'), (b'runner_on_file_diff', 'File Difference'), (b'playbook_on_start', 'Playbook Started'), (b'playbook_on_notify', 'Running Handlers'), (b'playbook_on_no_hosts_matched', 'No Hosts Matched'), (b'playbook_on_no_hosts_remaining', 'No Hosts Remaining'), (b'playbook_on_task_start', 'Task Started'), (b'playbook_on_vars_prompt', 'Variables Prompted'), (b'playbook_on_setup', 'Gathering Facts'), (b'playbook_on_import_for_host', 'internal: on Import for Host'), (b'playbook_on_not_import_for_host', 'internal: on Not Import for Host'), (b'playbook_on_play_start', 'Play Started'), (b'playbook_on_stats', 'Playbook Complete')])),
('event', models.CharField(max_length=100, choices=[('runner_on_failed', 'Host Failed'), ('runner_on_ok', 'Host OK'), ('runner_on_error', 'Host Failure'), ('runner_on_skipped', 'Host Skipped'), ('runner_on_unreachable', 'Host Unreachable'), ('runner_on_no_hosts', 'No Hosts Remaining'), ('runner_on_async_poll', 'Host Polling'), ('runner_on_async_ok', 'Host Async OK'), ('runner_on_async_failed', 'Host Async Failure'), ('runner_on_file_diff', 'File Difference'), ('playbook_on_start', 'Playbook Started'), ('playbook_on_notify', 'Running Handlers'), ('playbook_on_no_hosts_matched', 'No Hosts Matched'), ('playbook_on_no_hosts_remaining', 'No Hosts Remaining'), ('playbook_on_task_start', 'Task Started'), ('playbook_on_vars_prompt', 'Variables Prompted'), ('playbook_on_setup', 'Gathering Facts'), ('playbook_on_import_for_host', 'internal: on Import for Host'), ('playbook_on_not_import_for_host', 'internal: on Not Import for Host'), ('playbook_on_play_start', 'Play Started'), ('playbook_on_stats', 'Playbook Complete')])),
('event_data', jsonfield.fields.JSONField(default={}, blank=True)),
('failed', models.BooleanField(default=False, editable=False)),
('changed', models.BooleanField(default=False, editable=False)),
('host_name', models.CharField(default=b'', max_length=1024, editable=False)),
('play', models.CharField(default=b'', max_length=1024, editable=False)),
('role', models.CharField(default=b'', max_length=1024, editable=False)),
('task', models.CharField(default=b'', max_length=1024, editable=False)),
('host_name', models.CharField(default='', max_length=1024, editable=False)),
('play', models.CharField(default='', max_length=1024, editable=False)),
('role', models.CharField(default='', max_length=1024, editable=False)),
('task', models.CharField(default='', max_length=1024, editable=False)),
('counter', models.PositiveIntegerField(default=0)),
('host', models.ForeignKey(related_name='job_events_as_primary_host', on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to='main.Host', null=True)),
('hosts', models.ManyToManyField(related_name='job_events', editable=False, to='main.Host')),
@@ -220,7 +220,7 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('host_name', models.CharField(default=b'', max_length=1024, editable=False)),
('host_name', models.CharField(default='', max_length=1024, editable=False)),
('changed', models.PositiveIntegerField(default=0, editable=False)),
('dark', models.PositiveIntegerField(default=0, editable=False)),
('failures', models.PositiveIntegerField(default=0, editable=False)),
@@ -250,7 +250,7 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('active', models.BooleanField(default=True, editable=False)),
('name', models.CharField(unique=True, max_length=512)),
('admins', models.ManyToManyField(related_name='admin_of_organizations', to=settings.AUTH_USER_MODEL, blank=True)),
@@ -269,10 +269,10 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('active', models.BooleanField(default=True, editable=False)),
('name', models.CharField(max_length=512)),
('permission_type', models.CharField(max_length=64, choices=[(b'read', 'Read Inventory'), (b'write', 'Edit Inventory'), (b'admin', 'Administrate Inventory'), (b'run', 'Deploy To Inventory'), (b'check', 'Deploy To Inventory (Dry Run)'), (b'scan', 'Scan an Inventory'), (b'create', 'Create a Job Template')])),
('permission_type', models.CharField(max_length=64, choices=[('read', 'Read Inventory'), ('write', 'Edit Inventory'), ('admin', 'Administrate Inventory'), ('run', 'Deploy To Inventory'), ('check', 'Deploy To Inventory (Dry Run)'), ('scan', 'Scan an Inventory'), ('create', 'Create a Job Template')])),
('run_ad_hoc_commands', models.BooleanField(default=False, help_text='Execute Commands on the Inventory')),
('created_by', models.ForeignKey(related_name="{u'class': 'permission', u'app_label': 'main'}(class)s_created+", on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
('inventory', models.ForeignKey(related_name='permissions', on_delete=django.db.models.deletion.SET_NULL, to='main.Inventory', null=True)),
@@ -286,7 +286,7 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('ldap_dn', models.CharField(default=b'', max_length=1024)),
('ldap_dn', models.CharField(default='', max_length=1024)),
('user', awx.main.fields.AutoOneToOneField(related_name='profile', editable=False, to=settings.AUTH_USER_MODEL)),
],
),
@@ -296,7 +296,7 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('active', models.BooleanField(default=True, editable=False)),
('name', models.CharField(unique=True, max_length=512)),
('enabled', models.BooleanField(default=True)),
@@ -319,7 +319,7 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('active', models.BooleanField(default=True, editable=False)),
('name', models.CharField(max_length=512)),
('created_by', models.ForeignKey(related_name="{u'class': 'team', u'app_label': 'main'}(class)s_created+", on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
@@ -338,26 +338,26 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('active', models.BooleanField(default=True, editable=False)),
('name', models.CharField(max_length=512)),
('old_pk', models.PositiveIntegerField(default=None, null=True, editable=False)),
('launch_type', models.CharField(default=b'manual', max_length=20, editable=False, choices=[(b'manual', 'Manual'), (b'relaunch', 'Relaunch'), (b'callback', 'Callback'), (b'scheduled', 'Scheduled'), (b'dependency', 'Dependency')])),
('launch_type', models.CharField(default='manual', max_length=20, editable=False, choices=[('manual', 'Manual'), ('relaunch', 'Relaunch'), ('callback', 'Callback'), ('scheduled', 'Scheduled'), ('dependency', 'Dependency')])),
('cancel_flag', models.BooleanField(default=False, editable=False)),
('status', models.CharField(default=b'new', max_length=20, editable=False, choices=[(b'new', 'New'), (b'pending', 'Pending'), (b'waiting', 'Waiting'), (b'running', 'Running'), (b'successful', 'Successful'), (b'failed', 'Failed'), (b'error', 'Error'), (b'canceled', 'Canceled')])),
('status', models.CharField(default='new', max_length=20, editable=False, choices=[('new', 'New'), ('pending', 'Pending'), ('waiting', 'Waiting'), ('running', 'Running'), ('successful', 'Successful'), ('failed', 'Failed'), ('error', 'Error'), ('canceled', 'Canceled')])),
('failed', models.BooleanField(default=False, editable=False)),
('started', models.DateTimeField(default=None, null=True, editable=False)),
('finished', models.DateTimeField(default=None, null=True, editable=False)),
('elapsed', models.DecimalField(editable=False, max_digits=12, decimal_places=3)),
('job_args', models.TextField(default=b'', editable=False, blank=True)),
('job_cwd', models.CharField(default=b'', max_length=1024, editable=False, blank=True)),
('job_args', models.TextField(default='', editable=False, blank=True)),
('job_cwd', models.CharField(default='', max_length=1024, editable=False, blank=True)),
('job_env', jsonfield.fields.JSONField(default={}, editable=False, blank=True)),
('job_explanation', models.TextField(default=b'', editable=False, blank=True)),
('start_args', models.TextField(default=b'', editable=False, blank=True)),
('result_stdout_text', models.TextField(default=b'', editable=False, blank=True)),
('result_stdout_file', models.TextField(default=b'', editable=False, blank=True)),
('result_traceback', models.TextField(default=b'', editable=False, blank=True)),
('celery_task_id', models.CharField(default=b'', max_length=100, editable=False, blank=True)),
('job_explanation', models.TextField(default='', editable=False, blank=True)),
('start_args', models.TextField(default='', editable=False, blank=True)),
('result_stdout_text', models.TextField(default='', editable=False, blank=True)),
('result_stdout_file', models.TextField(default='', editable=False, blank=True)),
('result_traceback', models.TextField(default='', editable=False, blank=True)),
('celery_task_id', models.CharField(default='', max_length=100, editable=False, blank=True)),
],
),
migrations.CreateModel(
@@ -366,7 +366,7 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('active', models.BooleanField(default=True, editable=False)),
('name', models.CharField(max_length=512)),
('old_pk', models.PositiveIntegerField(default=None, null=True, editable=False)),
@@ -374,19 +374,19 @@ class Migration(migrations.Migration):
('last_job_run', models.DateTimeField(default=None, null=True, editable=False)),
('has_schedules', models.BooleanField(default=False, editable=False)),
('next_job_run', models.DateTimeField(default=None, null=True, editable=False)),
('status', models.CharField(default=b'ok', max_length=32, editable=False, choices=[(b'new', 'New'), (b'pending', 'Pending'), (b'waiting', 'Waiting'), (b'running', 'Running'), (b'successful', 'Successful'), (b'failed', 'Failed'), (b'error', 'Error'), (b'canceled', 'Canceled'), (b'never updated', b'Never Updated'), (b'ok', b'OK'), (b'missing', b'Missing'), (b'none', 'No External Source'), (b'updating', 'Updating')])),
('status', models.CharField(default='ok', max_length=32, editable=False, choices=[('new', 'New'), ('pending', 'Pending'), ('waiting', 'Waiting'), ('running', 'Running'), ('successful', 'Successful'), ('failed', 'Failed'), ('error', 'Error'), ('canceled', 'Canceled'), ('never updated', 'Never Updated'), ('ok', 'OK'), ('missing', 'Missing'), ('none', 'No External Source'), ('updating', 'Updating')])),
],
),
migrations.CreateModel(
name='AdHocCommand',
fields=[
('unifiedjob_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='main.UnifiedJob')),
('job_type', models.CharField(default=b'run', max_length=64, choices=[(b'run', 'Run'), (b'check', 'Check')])),
('limit', models.CharField(default=b'', max_length=1024, blank=True)),
('module_name', models.CharField(default=b'', max_length=1024, blank=True)),
('module_args', models.TextField(default=b'', blank=True)),
('job_type', models.CharField(default='run', max_length=64, choices=[('run', 'Run'), ('check', 'Check')])),
('limit', models.CharField(default='', max_length=1024, blank=True)),
('module_name', models.CharField(default='', max_length=1024, blank=True)),
('module_args', models.TextField(default='', blank=True)),
('forks', models.PositiveIntegerField(default=0, blank=True)),
('verbosity', models.PositiveIntegerField(default=0, blank=True, choices=[(0, b'0 (Normal)'), (1, b'1 (Verbose)'), (2, b'2 (More Verbose)'), (3, b'3 (Debug)'), (4, b'4 (Connection Debug)'), (5, b'5 (WinRM Debug)')])),
('verbosity', models.PositiveIntegerField(default=0, blank=True, choices=[(0, '0 (Normal)'), (1, '1 (Verbose)'), (2, '2 (More Verbose)'), (3, '3 (Debug)'), (4, '4 (Connection Debug)'), (5, '5 (WinRM Debug)')])),
('become_enabled', models.BooleanField(default=False)),
],
bases=('main.unifiedjob',),
@@ -395,12 +395,12 @@ class Migration(migrations.Migration):
name='InventorySource',
fields=[
('unifiedjobtemplate_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='main.UnifiedJobTemplate')),
('source', models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'Local File, Directory or Script'), (b'rax', 'Rackspace Cloud Servers'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure'), (b'vmware', 'VMware vCenter'), (b'openstack', 'OpenStack'), (b'custom', 'Custom Script')])),
('source_path', models.CharField(default=b'', max_length=1024, editable=False, blank=True)),
('source_vars', models.TextField(default=b'', help_text='Inventory source variables in YAML or JSON format.', blank=True)),
('source_regions', models.CharField(default=b'', max_length=1024, blank=True)),
('instance_filters', models.CharField(default=b'', help_text='Comma-separated list of filter expressions (EC2 only). Hosts are imported when ANY of the filters match.', max_length=1024, blank=True)),
('group_by', models.CharField(default=b'', help_text='Limit groups automatically created from inventory source (EC2 only).', max_length=1024, blank=True)),
('source', models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'Local File, Directory or Script'), ('rax', 'Rackspace Cloud Servers'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure'), ('vmware', 'VMware vCenter'), ('openstack', 'OpenStack'), ('custom', 'Custom Script')])),
('source_path', models.CharField(default='', max_length=1024, editable=False, blank=True)),
('source_vars', models.TextField(default='', help_text='Inventory source variables in YAML or JSON format.', blank=True)),
('source_regions', models.CharField(default='', max_length=1024, blank=True)),
('instance_filters', models.CharField(default='', help_text='Comma-separated list of filter expressions (EC2 only). Hosts are imported when ANY of the filters match.', max_length=1024, blank=True)),
('group_by', models.CharField(default='', help_text='Limit groups automatically created from inventory source (EC2 only).', max_length=1024, blank=True)),
('overwrite', models.BooleanField(default=False, help_text='Overwrite local groups and hosts from remote inventory source.')),
('overwrite_vars', models.BooleanField(default=False, help_text='Overwrite local variables from remote inventory source.')),
('update_on_launch', models.BooleanField(default=False)),
@@ -412,12 +412,12 @@ class Migration(migrations.Migration):
name='InventoryUpdate',
fields=[
('unifiedjob_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='main.UnifiedJob')),
('source', models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'Local File, Directory or Script'), (b'rax', 'Rackspace Cloud Servers'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure'), (b'vmware', 'VMware vCenter'), (b'openstack', 'OpenStack'), (b'custom', 'Custom Script')])),
('source_path', models.CharField(default=b'', max_length=1024, editable=False, blank=True)),
('source_vars', models.TextField(default=b'', help_text='Inventory source variables in YAML or JSON format.', blank=True)),
('source_regions', models.CharField(default=b'', max_length=1024, blank=True)),
('instance_filters', models.CharField(default=b'', help_text='Comma-separated list of filter expressions (EC2 only). Hosts are imported when ANY of the filters match.', max_length=1024, blank=True)),
('group_by', models.CharField(default=b'', help_text='Limit groups automatically created from inventory source (EC2 only).', max_length=1024, blank=True)),
('source', models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'Local File, Directory or Script'), ('rax', 'Rackspace Cloud Servers'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure'), ('vmware', 'VMware vCenter'), ('openstack', 'OpenStack'), ('custom', 'Custom Script')])),
('source_path', models.CharField(default='', max_length=1024, editable=False, blank=True)),
('source_vars', models.TextField(default='', help_text='Inventory source variables in YAML or JSON format.', blank=True)),
('source_regions', models.CharField(default='', max_length=1024, blank=True)),
('instance_filters', models.CharField(default='', help_text='Comma-separated list of filter expressions (EC2 only). Hosts are imported when ANY of the filters match.', max_length=1024, blank=True)),
('group_by', models.CharField(default='', help_text='Limit groups automatically created from inventory source (EC2 only).', max_length=1024, blank=True)),
('overwrite', models.BooleanField(default=False, help_text='Overwrite local groups and hosts from remote inventory source.')),
('overwrite_vars', models.BooleanField(default=False, help_text='Overwrite local variables from remote inventory source.')),
('license_error', models.BooleanField(default=False, editable=False)),
@@ -428,16 +428,16 @@ class Migration(migrations.Migration):
name='Job',
fields=[
('unifiedjob_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='main.UnifiedJob')),
('job_type', models.CharField(default=b'run', max_length=64, choices=[(b'run', 'Run'), (b'check', 'Check'), (b'scan', 'Scan')])),
('playbook', models.CharField(default=b'', max_length=1024, blank=True)),
('job_type', models.CharField(default='run', max_length=64, choices=[('run', 'Run'), ('check', 'Check'), ('scan', 'Scan')])),
('playbook', models.CharField(default='', max_length=1024, blank=True)),
('forks', models.PositiveIntegerField(default=0, blank=True)),
('limit', models.CharField(default=b'', max_length=1024, blank=True)),
('verbosity', models.PositiveIntegerField(default=0, blank=True, choices=[(0, b'0 (Normal)'), (1, b'1 (Verbose)'), (2, b'2 (More Verbose)'), (3, b'3 (Debug)'), (4, b'4 (Connection Debug)'), (5, b'5 (WinRM Debug)')])),
('extra_vars', models.TextField(default=b'', blank=True)),
('job_tags', models.CharField(default=b'', max_length=1024, blank=True)),
('limit', models.CharField(default='', max_length=1024, blank=True)),
('verbosity', models.PositiveIntegerField(default=0, blank=True, choices=[(0, '0 (Normal)'), (1, '1 (Verbose)'), (2, '2 (More Verbose)'), (3, '3 (Debug)'), (4, '4 (Connection Debug)'), (5, '5 (WinRM Debug)')])),
('extra_vars', models.TextField(default='', blank=True)),
('job_tags', models.CharField(default='', max_length=1024, blank=True)),
('force_handlers', models.BooleanField(default=False)),
('skip_tags', models.CharField(default=b'', max_length=1024, blank=True)),
('start_at_task', models.CharField(default=b'', max_length=1024, blank=True)),
('skip_tags', models.CharField(default='', max_length=1024, blank=True)),
('start_at_task', models.CharField(default='', max_length=1024, blank=True)),
('become_enabled', models.BooleanField(default=False)),
],
options={
@@ -449,18 +449,18 @@ class Migration(migrations.Migration):
name='JobTemplate',
fields=[
('unifiedjobtemplate_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='main.UnifiedJobTemplate')),
('job_type', models.CharField(default=b'run', max_length=64, choices=[(b'run', 'Run'), (b'check', 'Check'), (b'scan', 'Scan')])),
('playbook', models.CharField(default=b'', max_length=1024, blank=True)),
('job_type', models.CharField(default='run', max_length=64, choices=[('run', 'Run'), ('check', 'Check'), ('scan', 'Scan')])),
('playbook', models.CharField(default='', max_length=1024, blank=True)),
('forks', models.PositiveIntegerField(default=0, blank=True)),
('limit', models.CharField(default=b'', max_length=1024, blank=True)),
('verbosity', models.PositiveIntegerField(default=0, blank=True, choices=[(0, b'0 (Normal)'), (1, b'1 (Verbose)'), (2, b'2 (More Verbose)'), (3, b'3 (Debug)'), (4, b'4 (Connection Debug)'), (5, b'5 (WinRM Debug)')])),
('extra_vars', models.TextField(default=b'', blank=True)),
('job_tags', models.CharField(default=b'', max_length=1024, blank=True)),
('limit', models.CharField(default='', max_length=1024, blank=True)),
('verbosity', models.PositiveIntegerField(default=0, blank=True, choices=[(0, '0 (Normal)'), (1, '1 (Verbose)'), (2, '2 (More Verbose)'), (3, '3 (Debug)'), (4, '4 (Connection Debug)'), (5, '5 (WinRM Debug)')])),
('extra_vars', models.TextField(default='', blank=True)),
('job_tags', models.CharField(default='', max_length=1024, blank=True)),
('force_handlers', models.BooleanField(default=False)),
('skip_tags', models.CharField(default=b'', max_length=1024, blank=True)),
('start_at_task', models.CharField(default=b'', max_length=1024, blank=True)),
('skip_tags', models.CharField(default='', max_length=1024, blank=True)),
('start_at_task', models.CharField(default='', max_length=1024, blank=True)),
('become_enabled', models.BooleanField(default=False)),
('host_config_key', models.CharField(default=b'', max_length=1024, blank=True)),
('host_config_key', models.CharField(default='', max_length=1024, blank=True)),
('ask_variables_on_launch', models.BooleanField(default=False)),
('survey_enabled', models.BooleanField(default=False)),
('survey_spec', jsonfield.fields.JSONField(default={}, blank=True)),
@@ -475,9 +475,9 @@ class Migration(migrations.Migration):
fields=[
('unifiedjobtemplate_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='main.UnifiedJobTemplate')),
('local_path', models.CharField(help_text='Local path (relative to PROJECTS_ROOT) containing playbooks and related files for this project.', max_length=1024, blank=True)),
('scm_type', models.CharField(default=b'', max_length=8, verbose_name='SCM Type', blank=True, choices=[(b'', 'Manual'), (b'git', 'Git'), (b'hg', 'Mercurial'), (b'svn', 'Subversion')])),
('scm_url', models.CharField(default=b'', max_length=1024, verbose_name='SCM URL', blank=True)),
('scm_branch', models.CharField(default=b'', help_text='Specific branch, tag or commit to checkout.', max_length=256, verbose_name='SCM Branch', blank=True)),
('scm_type', models.CharField(default='', max_length=8, verbose_name='SCM Type', blank=True, choices=[('', 'Manual'), ('git', 'Git'), ('hg', 'Mercurial'), ('svn', 'Subversion')])),
('scm_url', models.CharField(default='', max_length=1024, verbose_name='SCM URL', blank=True)),
('scm_branch', models.CharField(default='', help_text='Specific branch, tag or commit to checkout.', max_length=256, verbose_name='SCM Branch', blank=True)),
('scm_clean', models.BooleanField(default=False)),
('scm_delete_on_update', models.BooleanField(default=False)),
('scm_delete_on_next_update', models.BooleanField(default=False, editable=False)),
@@ -494,9 +494,9 @@ class Migration(migrations.Migration):
fields=[
('unifiedjob_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='main.UnifiedJob')),
('local_path', models.CharField(help_text='Local path (relative to PROJECTS_ROOT) containing playbooks and related files for this project.', max_length=1024, blank=True)),
('scm_type', models.CharField(default=b'', max_length=8, verbose_name='SCM Type', blank=True, choices=[(b'', 'Manual'), (b'git', 'Git'), (b'hg', 'Mercurial'), (b'svn', 'Subversion')])),
('scm_url', models.CharField(default=b'', max_length=1024, verbose_name='SCM URL', blank=True)),
('scm_branch', models.CharField(default=b'', help_text='Specific branch, tag or commit to checkout.', max_length=256, verbose_name='SCM Branch', blank=True)),
('scm_type', models.CharField(default='', max_length=8, verbose_name='SCM Type', blank=True, choices=[('', 'Manual'), ('git', 'Git'), ('hg', 'Mercurial'), ('svn', 'Subversion')])),
('scm_url', models.CharField(default='', max_length=1024, verbose_name='SCM URL', blank=True)),
('scm_branch', models.CharField(default='', help_text='Specific branch, tag or commit to checkout.', max_length=256, verbose_name='SCM Branch', blank=True)),
('scm_clean', models.BooleanField(default=False)),
('scm_delete_on_update', models.BooleanField(default=False)),
],
@@ -506,8 +506,8 @@ class Migration(migrations.Migration):
name='SystemJob',
fields=[
('unifiedjob_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='main.UnifiedJob')),
('job_type', models.CharField(default=b'', max_length=32, blank=True, choices=[(b'cleanup_jobs', 'Remove jobs older than a certain number of days'), (b'cleanup_activitystream', 'Remove activity stream entries older than a certain number of days'), (b'cleanup_deleted', 'Purge previously deleted items from the database'), (b'cleanup_facts', 'Purge and/or reduce the granularity of system tracking data')])),
('extra_vars', models.TextField(default=b'', blank=True)),
('job_type', models.CharField(default='', max_length=32, blank=True, choices=[('cleanup_jobs', 'Remove jobs older than a certain number of days'), ('cleanup_activitystream', 'Remove activity stream entries older than a certain number of days'), ('cleanup_deleted', 'Purge previously deleted items from the database'), ('cleanup_facts', 'Purge and/or reduce the granularity of system tracking data')])),
('extra_vars', models.TextField(default='', blank=True)),
],
options={
'ordering': ('id',),
@@ -518,7 +518,7 @@ class Migration(migrations.Migration):
name='SystemJobTemplate',
fields=[
('unifiedjobtemplate_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='main.UnifiedJobTemplate')),
('job_type', models.CharField(default=b'', max_length=32, blank=True, choices=[(b'cleanup_jobs', 'Remove jobs older than a certain number of days'), (b'cleanup_activitystream', 'Remove activity stream entries older than a certain number of days'), (b'cleanup_deleted', 'Purge previously deleted items from the database'), (b'cleanup_facts', 'Purge and/or reduce the granularity of system tracking data')])),
('job_type', models.CharField(default='', max_length=32, blank=True, choices=[('cleanup_jobs', 'Remove jobs older than a certain number of days'), ('cleanup_activitystream', 'Remove activity stream entries older than a certain number of days'), ('cleanup_deleted', 'Purge previously deleted items from the database'), ('cleanup_facts', 'Purge and/or reduce the granularity of system tracking data')])),
],
bases=('main.unifiedjobtemplate', models.Model),
),


@@ -105,24 +105,24 @@ def create_system_job_templates(apps, schema_editor):
class Migration(migrations.Migration):
replaces = [(b'main', '0002_v300_tower_settings_changes'),
(b'main', '0003_v300_notification_changes'),
(b'main', '0004_v300_fact_changes'),
(b'main', '0005_v300_migrate_facts'),
(b'main', '0006_v300_active_flag_cleanup'),
(b'main', '0007_v300_active_flag_removal'),
(b'main', '0008_v300_rbac_changes'),
(b'main', '0009_v300_rbac_migrations'),
(b'main', '0010_v300_create_system_job_templates'),
(b'main', '0011_v300_credential_domain_field'),
(b'main', '0012_v300_create_labels'),
(b'main', '0013_v300_label_changes'),
(b'main', '0014_v300_invsource_cred'),
(b'main', '0015_v300_label_changes'),
(b'main', '0016_v300_prompting_changes'),
(b'main', '0017_v300_prompting_migrations'),
(b'main', '0018_v300_host_ordering'),
(b'main', '0019_v300_new_azure_credential'),]
replaces = [('main', '0002_v300_tower_settings_changes'),
('main', '0003_v300_notification_changes'),
('main', '0004_v300_fact_changes'),
('main', '0005_v300_migrate_facts'),
('main', '0006_v300_active_flag_cleanup'),
('main', '0007_v300_active_flag_removal'),
('main', '0008_v300_rbac_changes'),
('main', '0009_v300_rbac_migrations'),
('main', '0010_v300_create_system_job_templates'),
('main', '0011_v300_credential_domain_field'),
('main', '0012_v300_create_labels'),
('main', '0013_v300_label_changes'),
('main', '0014_v300_invsource_cred'),
('main', '0015_v300_label_changes'),
('main', '0016_v300_prompting_changes'),
('main', '0017_v300_prompting_migrations'),
('main', '0018_v300_host_ordering'),
('main', '0019_v300_new_azure_credential'),]
dependencies = [
('taggit', '0002_auto_20150616_2121'),
@@ -143,7 +143,7 @@ class Migration(migrations.Migration):
('description', models.TextField()),
('category', models.CharField(max_length=128)),
('value', models.TextField(blank=True)),
('value_type', models.CharField(max_length=12, choices=[(b'string', 'String'), (b'int', 'Integer'), (b'float', 'Decimal'), (b'json', 'JSON'), (b'bool', 'Boolean'), (b'password', 'Password'), (b'list', 'List')])),
('value_type', models.CharField(max_length=12, choices=[('string', 'String'), ('int', 'Integer'), ('float', 'Decimal'), ('json', 'JSON'), ('bool', 'Boolean'), ('password', 'Password'), ('list', 'List')])),
('user', models.ForeignKey(related_name='settings', default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
],
),
@@ -154,12 +154,12 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('status', models.CharField(default=b'pending', max_length=20, editable=False, choices=[(b'pending', 'Pending'), (b'successful', 'Successful'), (b'failed', 'Failed')])),
('error', models.TextField(default=b'', editable=False, blank=True)),
('status', models.CharField(default='pending', max_length=20, editable=False, choices=[('pending', 'Pending'), ('successful', 'Successful'), ('failed', 'Failed')])),
('error', models.TextField(default='', editable=False, blank=True)),
('notifications_sent', models.IntegerField(default=0, editable=False)),
('notification_type', models.CharField(max_length=32, choices=[(b'email', 'Email'), (b'slack', 'Slack'), (b'twilio', 'Twilio'), (b'pagerduty', 'Pagerduty'), (b'hipchat', 'HipChat'), (b'webhook', 'Webhook'), (b'mattermost', 'Mattermost'), (b'rocketchat', 'Rocket.Chat'), (b'irc', 'IRC')])),
('recipients', models.TextField(default=b'', editable=False, blank=True)),
('subject', models.TextField(default=b'', editable=False, blank=True)),
('notification_type', models.CharField(max_length=32, choices=[('email', 'Email'), ('slack', 'Slack'), ('twilio', 'Twilio'), ('pagerduty', 'Pagerduty'), ('hipchat', 'HipChat'), ('webhook', 'Webhook'), ('mattermost', 'Mattermost'), ('rocketchat', 'Rocket.Chat'), ('irc', 'IRC')])),
('recipients', models.TextField(default='', editable=False, blank=True)),
('subject', models.TextField(default='', editable=False, blank=True)),
('body', jsonfield.fields.JSONField(default=dict, blank=True)),
],
options={
@@ -172,9 +172,9 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('name', models.CharField(unique=True, max_length=512)),
('notification_type', models.CharField(max_length=32, choices=[(b'email', 'Email'), (b'slack', 'Slack'), (b'twilio', 'Twilio'), (b'pagerduty', 'Pagerduty'), (b'hipchat', 'HipChat'), (b'webhook', 'Webhook'), (b'mattermost', 'Mattermost'), (b'rocketchat', 'Rocket.Chat'), (b'irc', 'IRC')])),
('notification_type', models.CharField(max_length=32, choices=[('email', 'Email'), ('slack', 'Slack'), ('twilio', 'Twilio'), ('pagerduty', 'Pagerduty'), ('hipchat', 'HipChat'), ('webhook', 'Webhook'), ('mattermost', 'Mattermost'), ('rocketchat', 'Rocket.Chat'), ('irc', 'IRC')])),
('notification_configuration', jsonfield.fields.JSONField(default=dict)),
('created_by', models.ForeignKey(related_name="{u'class': 'notificationtemplate', u'app_label': 'main'}(class)s_created+", on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
('modified_by', models.ForeignKey(related_name="{u'class': 'notificationtemplate', u'app_label': 'main'}(class)s_modified+", on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
@@ -381,7 +381,7 @@ class Migration(migrations.Migration):
('singleton_name', models.TextField(default=None, unique=True, null=True, db_index=True)),
('members', models.ManyToManyField(related_name='roles', to=settings.AUTH_USER_MODEL)),
('parents', models.ManyToManyField(related_name='children', to='main.Role')),
('implicit_parents', models.TextField(default=b'[]')),
('implicit_parents', models.TextField(default='[]')),
('content_type', models.ForeignKey(default=None, to='contenttypes.ContentType', null=True)),
('object_id', models.PositiveIntegerField(default=None, null=True)),
@@ -422,122 +422,122 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='credential',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'singleton:system_administrator'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['singleton:system_administrator'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='credential',
name='use_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['admin_role'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='credential',
name='read_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'singleton:system_auditor', b'organization.auditor_role', b'use_role', b'admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['singleton:system_auditor', 'organization.auditor_role', 'use_role', 'admin_role'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='custominventoryscript',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'organization.admin_role', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='organization.admin_role', to='main.Role', null='True'),
),
migrations.AddField(
model_name='custominventoryscript',
name='read_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'organization.auditor_role', b'organization.member_role', b'admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['organization.auditor_role', 'organization.member_role', 'admin_role'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='inventory',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'organization.admin_role', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='organization.admin_role', to='main.Role', null='True'),
),
migrations.AddField(
model_name='inventory',
name='adhoc_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'admin_role', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='admin_role', to='main.Role', null='True'),
),
migrations.AddField(
model_name='inventory',
name='update_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'admin_role', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='admin_role', to='main.Role', null='True'),
),
migrations.AddField(
model_name='inventory',
name='use_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'adhoc_role', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='adhoc_role', to='main.Role', null='True'),
),
migrations.AddField(
model_name='inventory',
name='read_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'organization.auditor_role', b'update_role', b'use_role', b'admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['organization.auditor_role', 'update_role', 'use_role', 'admin_role'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='jobtemplate',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'project.organization.admin_role', b'inventory.organization.admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['project.organization.admin_role', 'inventory.organization.admin_role'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='jobtemplate',
name='execute_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['admin_role'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='jobtemplate',
name='read_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'project.organization.auditor_role', b'inventory.organization.auditor_role', b'execute_role', b'admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['project.organization.auditor_role', 'inventory.organization.auditor_role', 'execute_role', 'admin_role'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='organization',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'singleton:system_administrator', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='singleton:system_administrator', to='main.Role', null='True'),
),
migrations.AddField(
model_name='organization',
name='auditor_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'singleton:system_auditor', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='singleton:system_auditor', to='main.Role', null='True'),
),
migrations.AddField(
model_name='organization',
name='member_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'admin_role', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='admin_role', to='main.Role', null='True'),
),
migrations.AddField(
model_name='organization',
name='read_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'member_role', b'auditor_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['member_role', 'auditor_role'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='project',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'organization.admin_role', b'singleton:system_administrator'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['organization.admin_role', 'singleton:system_administrator'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='project',
name='use_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'admin_role', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='admin_role', to='main.Role', null='True'),
),
migrations.AddField(
model_name='project',
name='update_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'admin_role', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='admin_role', to='main.Role', null='True'),
),
migrations.AddField(
model_name='project',
name='read_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'organization.auditor_role', b'singleton:system_auditor', b'use_role', b'update_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['organization.auditor_role', 'singleton:system_auditor', 'use_role', 'update_role'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='team',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'organization.admin_role', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='organization.admin_role', to='main.Role', null='True'),
),
migrations.AddField(
model_name='team',
name='member_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=None, to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=None, to='main.Role', null='True'),
),
migrations.AddField(
model_name='team',
name='read_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'admin_role', b'organization.auditor_role', b'member_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['admin_role', 'organization.auditor_role', 'member_role'], to='main.Role', null='True'),
),
# System Job Templates
@@ -545,18 +545,18 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='systemjob',
name='job_type',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'cleanup_jobs', 'Remove jobs older than a certain number of days'), (b'cleanup_activitystream', 'Remove activity stream entries older than a certain number of days'), (b'cleanup_facts', 'Purge and/or reduce the granularity of system tracking data')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('cleanup_jobs', 'Remove jobs older than a certain number of days'), ('cleanup_activitystream', 'Remove activity stream entries older than a certain number of days'), ('cleanup_facts', 'Purge and/or reduce the granularity of system tracking data')]),
),
migrations.AlterField(
model_name='systemjobtemplate',
name='job_type',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'cleanup_jobs', 'Remove jobs older than a certain number of days'), (b'cleanup_activitystream', 'Remove activity stream entries older than a certain number of days'), (b'cleanup_facts', 'Purge and/or reduce the granularity of system tracking data')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('cleanup_jobs', 'Remove jobs older than a certain number of days'), ('cleanup_activitystream', 'Remove activity stream entries older than a certain number of days'), ('cleanup_facts', 'Purge and/or reduce the granularity of system tracking data')]),
),
# Credential domain field
migrations.AddField(
model_name='credential',
name='domain',
field=models.CharField(default=b'', help_text='The identifier for the domain.', max_length=100, verbose_name='Domain', blank=True),
field=models.CharField(default='', help_text='The identifier for the domain.', max_length=100, verbose_name='Domain', blank=True),
),
# Create Labels
migrations.CreateModel(
@@ -565,7 +565,7 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('name', models.CharField(max_length=512)),
('created_by', models.ForeignKey(related_name="{u'class': 'label', u'app_label': 'main'}(class)s_created+", on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
('modified_by', models.ForeignKey(related_name="{u'class': 'label', u'app_label': 'main'}(class)s_modified+", on_delete=django.db.models.deletion.SET_NULL, default=None, editable=False, to=settings.AUTH_USER_MODEL, null=True)),
@@ -625,7 +625,7 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='credential',
name='authorize_password',
field=models.CharField(default=b'', help_text='Password used by the authorize mechanism.', max_length=1024, blank=True),
field=models.CharField(default='', help_text='Password used by the authorize mechanism.', max_length=1024, blank=True),
),
migrations.AlterField(
model_name='credential',
@@ -640,17 +640,17 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='credential',
name='kind',
field=models.CharField(default=b'ssh', max_length=32, choices=[(b'ssh', 'Machine'), (b'net', 'Network'), (b'scm', 'Source Control'), (b'aws', 'Amazon Web Services'), (b'rax', 'Rackspace'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Red Hat Satellite 6'), (b'cloudforms', 'Red Hat CloudForms'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure'), (b'openstack', 'OpenStack')]),
field=models.CharField(default='ssh', max_length=32, choices=[('ssh', 'Machine'), ('net', 'Network'), ('scm', 'Source Control'), ('aws', 'Amazon Web Services'), ('rax', 'Rackspace'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure'), ('openstack', 'OpenStack')]),
),
migrations.AlterField(
model_name='inventorysource',
name='source',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'Local File, Directory or Script'), (b'rax', 'Rackspace Cloud Servers'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Red Hat Satellite 6'), (b'cloudforms', 'Red Hat CloudForms'), (b'openstack', 'OpenStack'), (b'custom', 'Custom Script')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'Local File, Directory or Script'), ('rax', 'Rackspace Cloud Servers'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('openstack', 'OpenStack'), ('custom', 'Custom Script')]),
),
migrations.AlterField(
model_name='inventoryupdate',
name='source',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'Local File, Directory or Script'), (b'rax', 'Rackspace Cloud Servers'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Red Hat Satellite 6'), (b'cloudforms', 'Red Hat CloudForms'), (b'openstack', 'OpenStack'), (b'custom', 'Custom Script')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'Local File, Directory or Script'), ('rax', 'Rackspace Cloud Servers'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('openstack', 'OpenStack'), ('custom', 'Custom Script')]),
),
migrations.AlterField(
model_name='team',
@@ -702,41 +702,41 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='credential',
name='client',
field=models.CharField(default=b'', help_text='Client Id or Application Id for the credential', max_length=128, blank=True),
field=models.CharField(default='', help_text='Client Id or Application Id for the credential', max_length=128, blank=True),
),
migrations.AddField(
model_name='credential',
name='secret',
field=models.CharField(default=b'', help_text='Secret Token for this credential', max_length=1024, blank=True),
field=models.CharField(default='', help_text='Secret Token for this credential', max_length=1024, blank=True),
),
migrations.AddField(
model_name='credential',
name='subscription',
field=models.CharField(default=b'', help_text='Subscription identifier for this credential', max_length=1024, blank=True),
field=models.CharField(default='', help_text='Subscription identifier for this credential', max_length=1024, blank=True),
),
migrations.AddField(
model_name='credential',
name='tenant',
field=models.CharField(default=b'', help_text='Tenant identifier for this credential', max_length=1024, blank=True),
field=models.CharField(default='', help_text='Tenant identifier for this credential', max_length=1024, blank=True),
),
migrations.AlterField(
model_name='credential',
name='kind',
field=models.CharField(default=b'ssh', max_length=32, choices=[(b'ssh', 'Machine'), (b'net', 'Network'), (b'scm', 'Source Control'), (b'aws', 'Amazon Web Services'), (b'rax', 'Rackspace'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Satellite 6'), (b'cloudforms', 'CloudForms'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure Classic (deprecated)'), (b'azure_rm', 'Microsoft Azure Resource Manager'), (b'openstack', 'OpenStack')]),
field=models.CharField(default='ssh', max_length=32, choices=[('ssh', 'Machine'), ('net', 'Network'), ('scm', 'Source Control'), ('aws', 'Amazon Web Services'), ('rax', 'Rackspace'), ('vmware', 'VMware vCenter'), ('satellite6', 'Satellite 6'), ('cloudforms', 'CloudForms'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure Classic (deprecated)'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('openstack', 'OpenStack')]),
),
migrations.AlterField(
model_name='host',
name='instance_id',
field=models.CharField(default=b'', max_length=1024, blank=True),
field=models.CharField(default='', max_length=1024, blank=True),
),
migrations.AlterField(
model_name='inventorysource',
name='source',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'Local File, Directory or Script'), (b'rax', 'Rackspace Cloud Servers'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure Classic (deprecated)'), (b'azure_rm', 'Microsoft Azure Resource Manager'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Satellite 6'), (b'cloudforms', 'CloudForms'), (b'openstack', 'OpenStack'), (b'custom', 'Custom Script')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'Local File, Directory or Script'), ('rax', 'Rackspace Cloud Servers'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure Classic (deprecated)'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('vmware', 'VMware vCenter'), ('satellite6', 'Satellite 6'), ('cloudforms', 'CloudForms'), ('openstack', 'OpenStack'), ('custom', 'Custom Script')]),
),
migrations.AlterField(
model_name='inventoryupdate',
name='source',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'Local File, Directory or Script'), (b'rax', 'Rackspace Cloud Servers'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure Classic (deprecated)'), (b'azure_rm', 'Microsoft Azure Resource Manager'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Satellite 6'), (b'cloudforms', 'CloudForms'), (b'openstack', 'OpenStack'), (b'custom', 'Custom Script')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'Local File, Directory or Script'), ('rax', 'Rackspace Cloud Servers'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure Classic (deprecated)'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('vmware', 'VMware vCenter'), ('satellite6', 'Satellite 6'), ('cloudforms', 'CloudForms'), ('openstack', 'OpenStack'), ('custom', 'Custom Script')]),
),
]
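
Note: the recurring change in the hunks above is the replacement of Python 2 byte-string literals (b'...') with plain str literals for field defaults, choice keys, and role paths. A minimal illustration, outside the AWX codebase, of why the distinction matters once this code runs on Python 3:

    # Illustrative only -- not AWX code. On Python 3, bytes and str are separate
    # types that never compare equal, so a CharField default of b'' would no
    # longer behave like the empty strings Django reads back from a text column.
    print(b'' == '')             # False on Python 3 (True on Python 2)
    print(isinstance(b'', str))  # False on Python 3
    print(isinstance('', str))   # True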

View File

@@ -9,20 +9,20 @@ from django.db import migrations, models
from django.conf import settings
import awx.main.fields
import _squashed
from _squashed_30 import SQUASHED_30
from . import _squashed
from ._squashed_30 import SQUASHED_30
class Migration(migrations.Migration):
replaces = [(b'main', '0020_v300_labels_changes'),
(b'main', '0021_v300_activity_stream'),
(b'main', '0022_v300_adhoc_extravars'),
(b'main', '0023_v300_activity_stream_ordering'),
(b'main', '0024_v300_jobtemplate_allow_simul'),
(b'main', '0025_v300_update_rbac_parents'),
(b'main', '0026_v300_credential_unique'),
(b'main', '0027_v300_team_migrations'),
(b'main', '0028_v300_org_team_cascade')] + _squashed.replaces(SQUASHED_30, applied=True)
replaces = [('main', '0020_v300_labels_changes'),
('main', '0021_v300_activity_stream'),
('main', '0022_v300_adhoc_extravars'),
('main', '0023_v300_activity_stream_ordering'),
('main', '0024_v300_jobtemplate_allow_simul'),
('main', '0025_v300_update_rbac_parents'),
('main', '0026_v300_credential_unique'),
('main', '0027_v300_team_migrations'),
('main', '0028_v300_org_team_cascade')] + _squashed.replaces(SQUASHED_30, applied=True)
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
@@ -63,22 +63,22 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='adhoccommand',
name='extra_vars',
field=models.TextField(default=b'', blank=True),
field=models.TextField(default='', blank=True),
),
migrations.AlterField(
model_name='credential',
name='kind',
field=models.CharField(default=b'ssh', max_length=32, choices=[(b'ssh', 'Machine'), (b'net', 'Network'), (b'scm', 'Source Control'), (b'aws', 'Amazon Web Services'), (b'rax', 'Rackspace'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Red Hat Satellite 6'), (b'cloudforms', 'Red Hat CloudForms'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure Classic (deprecated)'), (b'azure_rm', 'Microsoft Azure Resource Manager'), (b'openstack', 'OpenStack')]),
field=models.CharField(default='ssh', max_length=32, choices=[('ssh', 'Machine'), ('net', 'Network'), ('scm', 'Source Control'), ('aws', 'Amazon Web Services'), ('rax', 'Rackspace'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure Classic (deprecated)'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('openstack', 'OpenStack')]),
),
migrations.AlterField(
model_name='inventorysource',
name='source',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'Local File, Directory or Script'), (b'rax', 'Rackspace Cloud Servers'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure Classic (deprecated)'), (b'azure_rm', 'Microsoft Azure Resource Manager'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Red Hat Satellite 6'), (b'cloudforms', 'Red Hat CloudForms'), (b'openstack', 'OpenStack'), (b'custom', 'Custom Script')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'Local File, Directory or Script'), ('rax', 'Rackspace Cloud Servers'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure Classic (deprecated)'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('openstack', 'OpenStack'), ('custom', 'Custom Script')]),
),
migrations.AlterField(
model_name='inventoryupdate',
name='source',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'Local File, Directory or Script'), (b'rax', 'Rackspace Cloud Servers'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure', 'Microsoft Azure Classic (deprecated)'), (b'azure_rm', 'Microsoft Azure Resource Manager'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Red Hat Satellite 6'), (b'cloudforms', 'Red Hat CloudForms'), (b'openstack', 'OpenStack'), (b'custom', 'Custom Script')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'Local File, Directory or Script'), ('rax', 'Rackspace Cloud Servers'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure', 'Microsoft Azure Classic (deprecated)'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('openstack', 'OpenStack'), ('custom', 'Custom Script')]),
),
# jobtemplate allow simul
migrations.AddField(
@@ -90,17 +90,17 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='credential',
name='use_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'organization.admin_role', b'admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['organization.admin_role', 'admin_role'], to='main.Role', null='True'),
),
migrations.AlterField(
model_name='team',
name='member_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'admin_role', to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role='admin_role', to='main.Role', null='True'),
),
migrations.AlterField(
model_name='team',
name='read_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'organization.auditor_role', b'member_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['organization.auditor_role', 'member_role'], to='main.Role', null='True'),
),
# Unique credential
migrations.AlterUniqueTogether(
@@ -110,7 +110,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='credential',
name='read_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'singleton:system_auditor', b'organization.auditor_role', b'use_role', b'admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['singleton:system_auditor', 'organization.auditor_role', 'use_role', 'admin_role'], to='main.Role', null='True'),
),
# Team cascade
migrations.AlterField(

View File

@@ -8,8 +8,8 @@ import django.db.models.deletion
import awx.main.models.workflow
import awx.main.fields
import _squashed
from _squashed_30 import SQUASHED_30
from . import _squashed
from ._squashed_30 import SQUASHED_30
class Migration(migrations.Migration):
@@ -19,7 +19,7 @@ class Migration(migrations.Migration):
]
replaces = _squashed.replaces(SQUASHED_30) + [
(b'main', '0034_v310_release'),
('main', '0034_v310_release'),
]
operations = _squashed.operations(SQUASHED_30) + [
@@ -42,13 +42,13 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='jobevent',
name='uuid',
field=models.CharField(default=b'', max_length=1024, editable=False),
field=models.CharField(default='', max_length=1024, editable=False),
),
# Job Parent Event UUID
migrations.AddField(
model_name='jobevent',
name='parent_uuid',
field=models.CharField(default=b'', max_length=1024, editable=False),
field=models.CharField(default='', max_length=1024, editable=False),
),
# Modify the HA Instance
migrations.RemoveField(
@@ -63,19 +63,19 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='credential',
name='become_method',
field=models.CharField(default=b'', help_text='Privilege escalation method.', max_length=32, blank=True, choices=[(b'', 'None'), (b'sudo', 'Sudo'), (b'su', 'Su'), (b'pbrun', 'Pbrun'), (b'pfexec', 'Pfexec'), (b'dzdo', 'DZDO'), (b'pmrun', 'Pmrun')]),
field=models.CharField(default='', help_text='Privilege escalation method.', max_length=32, blank=True, choices=[('', 'None'), ('sudo', 'Sudo'), ('su', 'Su'), ('pbrun', 'Pbrun'), ('pfexec', 'Pfexec'), ('dzdo', 'DZDO'), ('pmrun', 'Pmrun')]),
),
# Add Workflows
migrations.AlterField(
model_name='unifiedjob',
name='launch_type',
field=models.CharField(default=b'manual', max_length=20, editable=False, choices=[(b'manual', 'Manual'), (b'relaunch', 'Relaunch'), (b'callback', 'Callback'), (b'scheduled', 'Scheduled'), (b'dependency', 'Dependency'), (b'workflow', 'Workflow'), (b'sync', 'Sync')]),
field=models.CharField(default='manual', max_length=20, editable=False, choices=[('manual', 'Manual'), ('relaunch', 'Relaunch'), ('callback', 'Callback'), ('scheduled', 'Scheduled'), ('dependency', 'Dependency'), ('workflow', 'Workflow'), ('sync', 'Sync')]),
),
migrations.CreateModel(
name='WorkflowJob',
fields=[
('unifiedjob_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='main.UnifiedJob')),
('extra_vars', models.TextField(default=b'', blank=True)),
('extra_vars', models.TextField(default='', blank=True)),
],
options={
'ordering': ('id',),
@@ -101,8 +101,8 @@ class Migration(migrations.Migration):
name='WorkflowJobTemplate',
fields=[
('unifiedjobtemplate_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='main.UnifiedJobTemplate')),
('extra_vars', models.TextField(default=b'', blank=True)),
('admin_role', awx.main.fields.ImplicitRoleField(related_name='+', parent_role=b'singleton:system_administrator', to='main.Role', null=b'True')),
('extra_vars', models.TextField(default='', blank=True)),
('admin_role', awx.main.fields.ImplicitRoleField(related_name='+', parent_role='singleton:system_administrator', to='main.Role', null='True')),
],
bases=('main.unifiedjobtemplate', models.Model),
),
@@ -176,7 +176,7 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='workflowjobtemplate',
name='execute_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['admin_role'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='workflowjobtemplate',
@@ -186,7 +186,7 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='workflowjobtemplate',
name='read_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'singleton:system_auditor', b'organization.auditor_role', b'execute_role', b'admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['singleton:system_auditor', 'organization.auditor_role', 'execute_role', 'admin_role'], to='main.Role', null='True'),
),
migrations.AddField(
model_name='workflowjobtemplatenode',
@@ -216,7 +216,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='workflowjobtemplate',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'singleton:system_administrator', b'organization.admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['singleton:system_administrator', 'organization.admin_role'], to='main.Role', null='True'),
),
migrations.AlterField(
model_name='workflowjobtemplatenode',
@@ -269,23 +269,23 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='unifiedjob',
name='execution_node',
field=models.TextField(default=b'', editable=False, blank=True),
field=models.TextField(default='', editable=False, blank=True),
),
# SCM Revision
migrations.AddField(
model_name='project',
name='scm_revision',
field=models.CharField(default=b'', editable=False, max_length=1024, blank=True, help_text='The last revision fetched by a project update', verbose_name='SCM Revision'),
field=models.CharField(default='', editable=False, max_length=1024, blank=True, help_text='The last revision fetched by a project update', verbose_name='SCM Revision'),
),
migrations.AddField(
model_name='projectupdate',
name='job_type',
field=models.CharField(default=b'check', max_length=64, choices=[(b'run', 'Run'), (b'check', 'Check')]),
field=models.CharField(default='check', max_length=64, choices=[('run', 'Run'), ('check', 'Check')]),
),
migrations.AddField(
model_name='job',
name='scm_revision',
field=models.CharField(default=b'', editable=False, max_length=1024, blank=True, help_text='The SCM Revision from the Project used for this job, if available', verbose_name='SCM Revision'),
field=models.CharField(default='', editable=False, max_length=1024, blank=True, help_text='The SCM Revision from the Project used for this job, if available', verbose_name='SCM Revision'),
),
# Project Playbook Files
migrations.AddField(
@@ -307,12 +307,12 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='adhoccommandevent',
name='stdout',
field=models.TextField(default=b'', editable=False),
field=models.TextField(default='', editable=False),
),
migrations.AddField(
model_name='adhoccommandevent',
name='uuid',
field=models.CharField(default=b'', max_length=1024, editable=False),
field=models.CharField(default='', max_length=1024, editable=False),
),
migrations.AddField(
model_name='adhoccommandevent',
@@ -327,7 +327,7 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='jobevent',
name='playbook',
field=models.CharField(default=b'', max_length=1024, editable=False),
field=models.CharField(default='', max_length=1024, editable=False),
),
migrations.AddField(
model_name='jobevent',
@@ -337,7 +337,7 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='jobevent',
name='stdout',
field=models.TextField(default=b'', editable=False),
field=models.TextField(default='', editable=False),
),
migrations.AddField(
model_name='jobevent',
@@ -352,7 +352,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='adhoccommandevent',
name='event',
field=models.CharField(max_length=100, choices=[(b'runner_on_failed', 'Host Failed'), (b'runner_on_ok', 'Host OK'), (b'runner_on_unreachable', 'Host Unreachable'), (b'runner_on_skipped', 'Host Skipped'), (b'debug', 'Debug'), (b'verbose', 'Verbose'), (b'deprecated', 'Deprecated'), (b'warning', 'Warning'), (b'system_warning', 'System Warning'), (b'error', 'Error')]),
field=models.CharField(max_length=100, choices=[('runner_on_failed', 'Host Failed'), ('runner_on_ok', 'Host OK'), ('runner_on_unreachable', 'Host Unreachable'), ('runner_on_skipped', 'Host Skipped'), ('debug', 'Debug'), ('verbose', 'Verbose'), ('deprecated', 'Deprecated'), ('warning', 'Warning'), ('system_warning', 'System Warning'), ('error', 'Error')]),
),
migrations.AlterField(
model_name='jobevent',
@@ -362,7 +362,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='jobevent',
name='event',
field=models.CharField(max_length=100, choices=[(b'runner_on_failed', 'Host Failed'), (b'runner_on_ok', 'Host OK'), (b'runner_on_error', 'Host Failure'), (b'runner_on_skipped', 'Host Skipped'), (b'runner_on_unreachable', 'Host Unreachable'), (b'runner_on_no_hosts', 'No Hosts Remaining'), (b'runner_on_async_poll', 'Host Polling'), (b'runner_on_async_ok', 'Host Async OK'), (b'runner_on_async_failed', 'Host Async Failure'), (b'runner_item_on_ok', 'Item OK'), (b'runner_item_on_failed', 'Item Failed'), (b'runner_item_on_skipped', 'Item Skipped'), (b'runner_retry', 'Host Retry'), (b'runner_on_file_diff', 'File Difference'), (b'playbook_on_start', 'Playbook Started'), (b'playbook_on_notify', 'Running Handlers'), (b'playbook_on_include', 'Including File'), (b'playbook_on_no_hosts_matched', 'No Hosts Matched'), (b'playbook_on_no_hosts_remaining', 'No Hosts Remaining'), (b'playbook_on_task_start', 'Task Started'), (b'playbook_on_vars_prompt', 'Variables Prompted'), (b'playbook_on_setup', 'Gathering Facts'), (b'playbook_on_import_for_host', 'internal: on Import for Host'), (b'playbook_on_not_import_for_host', 'internal: on Not Import for Host'), (b'playbook_on_play_start', 'Play Started'), (b'playbook_on_stats', 'Playbook Complete'), (b'debug', 'Debug'), (b'verbose', 'Verbose'), (b'deprecated', 'Deprecated'), (b'warning', 'Warning'), (b'system_warning', 'System Warning'), (b'error', 'Error')]),
field=models.CharField(max_length=100, choices=[('runner_on_failed', 'Host Failed'), ('runner_on_ok', 'Host OK'), ('runner_on_error', 'Host Failure'), ('runner_on_skipped', 'Host Skipped'), ('runner_on_unreachable', 'Host Unreachable'), ('runner_on_no_hosts', 'No Hosts Remaining'), ('runner_on_async_poll', 'Host Polling'), ('runner_on_async_ok', 'Host Async OK'), ('runner_on_async_failed', 'Host Async Failure'), ('runner_item_on_ok', 'Item OK'), ('runner_item_on_failed', 'Item Failed'), ('runner_item_on_skipped', 'Item Skipped'), ('runner_retry', 'Host Retry'), ('runner_on_file_diff', 'File Difference'), ('playbook_on_start', 'Playbook Started'), ('playbook_on_notify', 'Running Handlers'), ('playbook_on_include', 'Including File'), ('playbook_on_no_hosts_matched', 'No Hosts Matched'), ('playbook_on_no_hosts_remaining', 'No Hosts Remaining'), ('playbook_on_task_start', 'Task Started'), ('playbook_on_vars_prompt', 'Variables Prompted'), ('playbook_on_setup', 'Gathering Facts'), ('playbook_on_import_for_host', 'internal: on Import for Host'), ('playbook_on_not_import_for_host', 'internal: on Not Import for Host'), ('playbook_on_play_start', 'Play Started'), ('playbook_on_stats', 'Playbook Complete'), ('debug', 'Debug'), ('verbose', 'Verbose'), ('deprecated', 'Deprecated'), ('warning', 'Warning'), ('system_warning', 'System Warning'), ('error', 'Error')]),
),
migrations.AlterUniqueTogether(
name='adhoccommandevent',
@@ -505,7 +505,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='host',
name='instance_id',
field=models.CharField(default=b'', help_text='The value used by the remote inventory source to uniquely identify the host', max_length=1024, blank=True),
field=models.CharField(default='', help_text='The value used by the remote inventory source to uniquely identify the host', max_length=1024, blank=True),
),
migrations.AlterField(
model_name='project',
@@ -520,7 +520,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='project',
name='scm_type',
field=models.CharField(default=b'', choices=[(b'', 'Manual'), (b'git', 'Git'), (b'hg', 'Mercurial'), (b'svn', 'Subversion')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
field=models.CharField(default='', choices=[('', 'Manual'), ('git', 'Git'), ('hg', 'Mercurial'), ('svn', 'Subversion')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
),
migrations.AlterField(
model_name='project',
@@ -535,7 +535,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='project',
name='scm_url',
field=models.CharField(default=b'', help_text='The location where the project is stored.', max_length=1024, verbose_name='SCM URL', blank=True),
field=models.CharField(default='', help_text='The location where the project is stored.', max_length=1024, verbose_name='SCM URL', blank=True),
),
migrations.AlterField(
model_name='project',
@@ -555,12 +555,12 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='projectupdate',
name='scm_type',
field=models.CharField(default=b'', choices=[(b'', 'Manual'), (b'git', 'Git'), (b'hg', 'Mercurial'), (b'svn', 'Subversion')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
field=models.CharField(default='', choices=[('', 'Manual'), ('git', 'Git'), ('hg', 'Mercurial'), ('svn', 'Subversion')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
),
migrations.AlterField(
model_name='projectupdate',
name='scm_url',
field=models.CharField(default=b'', help_text='The location where the project is stored.', max_length=1024, verbose_name='SCM URL', blank=True),
field=models.CharField(default='', help_text='The location where the project is stored.', max_length=1024, verbose_name='SCM URL', blank=True),
),
migrations.AlterField(
model_name='projectupdate',
@@ -600,7 +600,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='unifiedjob',
name='execution_node',
field=models.TextField(default=b'', help_text='The Tower node the job executed on.', editable=False, blank=True),
field=models.TextField(default='', help_text='The Tower node the job executed on.', editable=False, blank=True),
),
migrations.AlterField(
model_name='unifiedjob',
@@ -610,7 +610,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='unifiedjob',
name='job_explanation',
field=models.TextField(default=b'', help_text="A status field to indicate the state of the job if it wasn't able to run and capture stdout", editable=False, blank=True),
field=models.TextField(default='', help_text="A status field to indicate the state of the job if it wasn't able to run and capture stdout", editable=False, blank=True),
),
migrations.AlterField(
model_name='unifiedjob',

View File

@@ -2,8 +2,8 @@
from __future__ import unicode_literals
from django.db import migrations
import _squashed
from _squashed_31 import SQUASHED_31
from . import _squashed
from ._squashed_31 import SQUASHED_31
class Migration(migrations.Migration):
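
The only change in this file is the import style. A self-contained sketch (package and module names are made up, not the real AWX layout) of why the bare sibling import had to become an explicit relative import on Python 3:

    # Hypothetical demo: Python 3 treats a bare "import _squashed" as an absolute
    # import, so a migration module can no longer pick up a sibling file that way.
    import importlib
    import os
    import sys
    import tempfile
    import textwrap

    root = tempfile.mkdtemp()
    os.mkdir(os.path.join(root, "mig"))
    open(os.path.join(root, "mig", "__init__.py"), "w").close()
    with open(os.path.join(root, "mig", "_squashed.py"), "w") as f:
        f.write("SQUASHED = ['0001_initial']\n")
    with open(os.path.join(root, "mig", "uses_sibling.py"), "w") as f:
        f.write(textwrap.dedent("""\
            # "import _squashed" raises ModuleNotFoundError here on Python 3;
            # the explicit relative form works from inside the package:
            from . import _squashed
            REPLACES = _squashed.SQUASHED
        """))

    sys.path.insert(0, root)
    print(importlib.import_module("mig.uses_sibling").REPLACES)  # ['0001_initial']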

View File

@@ -72,7 +72,7 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='inventory',
name='kind',
field=models.CharField(default=b'', help_text='Kind of inventory being represented.', max_length=32, blank=True, choices=[(b'', 'Hosts have a direct link to this inventory.'), (b'smart', 'Hosts for inventory generated using the host_filter property.')]),
field=models.CharField(default='', help_text='Kind of inventory being represented.', max_length=32, blank=True, choices=[('', 'Hosts have a direct link to this inventory.'), ('smart', 'Hosts for inventory generated using the host_filter property.')]),
),
migrations.CreateModel(
name='SmartInventoryMembership',
@@ -143,7 +143,7 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='inventorysource',
name='scm_last_revision',
field=models.CharField(default=b'', max_length=1024, editable=False, blank=True),
field=models.CharField(default='', max_length=1024, editable=False, blank=True),
),
migrations.AddField(
model_name='inventorysource',
@@ -163,27 +163,27 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='inventorysource',
name='source',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'File, Directory or Script'), (b'scm', 'Sourced from a Project'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure_rm', 'Microsoft Azure Resource Manager'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Red Hat Satellite 6'), (b'cloudforms', 'Red Hat CloudForms'), (b'openstack', 'OpenStack'), (b'custom', 'Custom Script')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'File, Directory or Script'), ('scm', 'Sourced from a Project'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('openstack', 'OpenStack'), ('custom', 'Custom Script')]),
),
migrations.AlterField(
model_name='inventoryupdate',
name='source',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'File, Directory or Script'), (b'scm', 'Sourced from a Project'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure_rm', 'Microsoft Azure Resource Manager'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Red Hat Satellite 6'), (b'cloudforms', 'Red Hat CloudForms'), (b'openstack', 'OpenStack'), (b'custom', 'Custom Script')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'File, Directory or Script'), ('scm', 'Sourced from a Project'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('openstack', 'OpenStack'), ('custom', 'Custom Script')]),
),
migrations.AlterField(
model_name='inventorysource',
name='source_path',
field=models.CharField(default=b'', max_length=1024, blank=True),
field=models.CharField(default='', max_length=1024, blank=True),
),
migrations.AlterField(
model_name='inventoryupdate',
name='source_path',
field=models.CharField(default=b'', max_length=1024, blank=True),
field=models.CharField(default='', max_length=1024, blank=True),
),
migrations.AlterField(
model_name='unifiedjob',
name='launch_type',
field=models.CharField(default=b'manual', max_length=20, editable=False, choices=[(b'manual', 'Manual'), (b'relaunch', 'Relaunch'), (b'callback', 'Callback'), (b'scheduled', 'Scheduled'), (b'dependency', 'Dependency'), (b'workflow', 'Workflow'), (b'sync', 'Sync'), (b'scm', 'SCM Update')]),
field=models.CharField(default='manual', max_length=20, editable=False, choices=[('manual', 'Manual'), ('relaunch', 'Relaunch'), ('callback', 'Callback'), ('scheduled', 'Scheduled'), ('dependency', 'Dependency'), ('workflow', 'Workflow'), ('sync', 'Sync'), ('scm', 'SCM Update')]),
),
migrations.AddField(
model_name='inventorysource',
@@ -211,12 +211,12 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='inventorysource',
name='verbosity',
field=models.PositiveIntegerField(default=1, blank=True, choices=[(0, b'0 (WARNING)'), (1, b'1 (INFO)'), (2, b'2 (DEBUG)')]),
field=models.PositiveIntegerField(default=1, blank=True, choices=[(0, '0 (WARNING)'), (1, '1 (INFO)'), (2, '2 (DEBUG)')]),
),
migrations.AddField(
model_name='inventoryupdate',
name='verbosity',
field=models.PositiveIntegerField(default=1, blank=True, choices=[(0, b'0 (WARNING)'), (1, b'1 (INFO)'), (2, b'2 (DEBUG)')]),
field=models.PositiveIntegerField(default=1, blank=True, choices=[(0, '0 (WARNING)'), (1, '1 (INFO)'), (2, '2 (DEBUG)')]),
),
# Job Templates
@@ -317,7 +317,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='inventory',
name='kind',
field=models.CharField(default=b'', help_text='Kind of inventory being represented.', max_length=32, blank=True, choices=[(b'', 'Hosts have a direct link to this inventory.'), (b'smart', 'Hosts for inventory generated using the host_filter property.')]),
field=models.CharField(default='', help_text='Kind of inventory being represented.', max_length=32, blank=True, choices=[('', 'Hosts have a direct link to this inventory.'), ('smart', 'Hosts for inventory generated using the host_filter property.')]),
),
# Timeout help text update
@@ -378,9 +378,9 @@ class Migration(migrations.Migration):
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('description', models.TextField(default=b'', blank=True)),
('description', models.TextField(default='', blank=True)),
('name', models.CharField(max_length=512)),
('kind', models.CharField(max_length=32, choices=[(b'ssh', 'Machine'), (b'vault', 'Vault'), (b'net', 'Network'), (b'scm', 'Source Control'), (b'cloud', 'Cloud'), (b'insights', 'Insights')])),
('kind', models.CharField(max_length=32, choices=[('ssh', 'Machine'), ('vault', 'Vault'), ('net', 'Network'), ('scm', 'Source Control'), ('cloud', 'Cloud'), ('insights', 'Insights')])),
('managed_by_tower', models.BooleanField(default=False, editable=False)),
('inputs', awx.main.fields.CredentialTypeInputField(default={}, blank=True, help_text='Enter inputs using either JSON or YAML syntax. Use the radio button to toggle between the two. Refer to the Ansible Tower documentation for example syntax.')),
('injectors', awx.main.fields.CredentialTypeInjectorField(default={}, blank=True, help_text='Enter injectors using either JSON or YAML syntax. Use the radio button to toggle between the two. Refer to the Ansible Tower documentation for example syntax.')),
@@ -435,7 +435,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='credential',
name='become_method',
field=models.CharField(default=b'', help_text='Privilege escalation method.', max_length=32, blank=True, choices=[(b'', 'None'), (b'sudo', 'Sudo'), (b'su', 'Su'), (b'pbrun', 'Pbrun'), (b'pfexec', 'Pfexec'), (b'dzdo', 'DZDO'), (b'pmrun', 'Pmrun'), (b'runas', 'Runas')]),
field=models.CharField(default='', help_text='Privilege escalation method.', max_length=32, blank=True, choices=[('', 'None'), ('sudo', 'Sudo'), ('su', 'Su'), ('pbrun', 'Pbrun'), ('pfexec', 'Pfexec'), ('dzdo', 'DZDO'), ('pmrun', 'Pmrun'), ('runas', 'Runas')]),
),
# Connecting activity stream
@@ -496,6 +496,6 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='unifiedjob',
name='execution_node',
field=models.TextField(default=b'', help_text='The node the job executed on.', editable=False, blank=True),
field=models.TextField(default='', help_text='The node the job executed on.', editable=False, blank=True),
),
]

View File

@@ -20,11 +20,11 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='inventorysource',
name='source',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'File, Directory or Script'), (b'scm', 'Sourced from a Project'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure_rm', 'Microsoft Azure Resource Manager'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Red Hat Satellite 6'), (b'cloudforms', 'Red Hat CloudForms'), (b'openstack', 'OpenStack'), (b'rhv', 'Red Hat Virtualization'), (b'tower', 'Ansible Tower'), (b'custom', 'Custom Script')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'File, Directory or Script'), ('scm', 'Sourced from a Project'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('openstack', 'OpenStack'), ('rhv', 'Red Hat Virtualization'), ('tower', 'Ansible Tower'), ('custom', 'Custom Script')]),
),
migrations.AlterField(
model_name='inventoryupdate',
name='source',
field=models.CharField(default=b'', max_length=32, blank=True, choices=[(b'', 'Manual'), (b'file', 'File, Directory or Script'), (b'scm', 'Sourced from a Project'), (b'ec2', 'Amazon EC2'), (b'gce', 'Google Compute Engine'), (b'azure_rm', 'Microsoft Azure Resource Manager'), (b'vmware', 'VMware vCenter'), (b'satellite6', 'Red Hat Satellite 6'), (b'cloudforms', 'Red Hat CloudForms'), (b'openstack', 'OpenStack'), (b'rhv', 'Red Hat Virtualization'), (b'tower', 'Ansible Tower'), (b'custom', 'Custom Script')]),
field=models.CharField(default='', max_length=32, blank=True, choices=[('', 'Manual'), ('file', 'File, Directory or Script'), ('scm', 'Sourced from a Project'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('openstack', 'OpenStack'), ('rhv', 'Red Hat Virtualization'), ('tower', 'Ansible Tower'), ('custom', 'Custom Script')]),
),
]

View File

@@ -21,9 +21,9 @@ class Migration(migrations.Migration):
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('event_data', awx.main.fields.JSONField(blank=True, default={})),
('uuid', models.CharField(default=b'', editable=False, max_length=1024)),
('uuid', models.CharField(default='', editable=False, max_length=1024)),
('counter', models.PositiveIntegerField(default=0, editable=False)),
('stdout', models.TextField(default=b'', editable=False)),
('stdout', models.TextField(default='', editable=False)),
('verbosity', models.PositiveIntegerField(default=0, editable=False)),
('start_line', models.PositiveIntegerField(default=0, editable=False)),
('end_line', models.PositiveIntegerField(default=0, editable=False)),
@@ -39,17 +39,17 @@ class Migration(migrations.Migration):
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('event', models.CharField(choices=[(b'runner_on_failed', 'Host Failed'), (b'runner_on_ok', 'Host OK'), (b'runner_on_error', 'Host Failure'), (b'runner_on_skipped', 'Host Skipped'), (b'runner_on_unreachable', 'Host Unreachable'), (b'runner_on_no_hosts', 'No Hosts Remaining'), (b'runner_on_async_poll', 'Host Polling'), (b'runner_on_async_ok', 'Host Async OK'), (b'runner_on_async_failed', 'Host Async Failure'), (b'runner_item_on_ok', 'Item OK'), (b'runner_item_on_failed', 'Item Failed'), (b'runner_item_on_skipped', 'Item Skipped'), (b'runner_retry', 'Host Retry'), (b'runner_on_file_diff', 'File Difference'), (b'playbook_on_start', 'Playbook Started'), (b'playbook_on_notify', 'Running Handlers'), (b'playbook_on_include', 'Including File'), (b'playbook_on_no_hosts_matched', 'No Hosts Matched'), (b'playbook_on_no_hosts_remaining', 'No Hosts Remaining'), (b'playbook_on_task_start', 'Task Started'), (b'playbook_on_vars_prompt', 'Variables Prompted'), (b'playbook_on_setup', 'Gathering Facts'), (b'playbook_on_import_for_host', 'internal: on Import for Host'), (b'playbook_on_not_import_for_host', 'internal: on Not Import for Host'), (b'playbook_on_play_start', 'Play Started'), (b'playbook_on_stats', 'Playbook Complete'), (b'debug', 'Debug'), (b'verbose', 'Verbose'), (b'deprecated', 'Deprecated'), (b'warning', 'Warning'), (b'system_warning', 'System Warning'), (b'error', 'Error')], max_length=100)),
('event', models.CharField(choices=[('runner_on_failed', 'Host Failed'), ('runner_on_ok', 'Host OK'), ('runner_on_error', 'Host Failure'), ('runner_on_skipped', 'Host Skipped'), ('runner_on_unreachable', 'Host Unreachable'), ('runner_on_no_hosts', 'No Hosts Remaining'), ('runner_on_async_poll', 'Host Polling'), ('runner_on_async_ok', 'Host Async OK'), ('runner_on_async_failed', 'Host Async Failure'), ('runner_item_on_ok', 'Item OK'), ('runner_item_on_failed', 'Item Failed'), ('runner_item_on_skipped', 'Item Skipped'), ('runner_retry', 'Host Retry'), ('runner_on_file_diff', 'File Difference'), ('playbook_on_start', 'Playbook Started'), ('playbook_on_notify', 'Running Handlers'), ('playbook_on_include', 'Including File'), ('playbook_on_no_hosts_matched', 'No Hosts Matched'), ('playbook_on_no_hosts_remaining', 'No Hosts Remaining'), ('playbook_on_task_start', 'Task Started'), ('playbook_on_vars_prompt', 'Variables Prompted'), ('playbook_on_setup', 'Gathering Facts'), ('playbook_on_import_for_host', 'internal: on Import for Host'), ('playbook_on_not_import_for_host', 'internal: on Not Import for Host'), ('playbook_on_play_start', 'Play Started'), ('playbook_on_stats', 'Playbook Complete'), ('debug', 'Debug'), ('verbose', 'Verbose'), ('deprecated', 'Deprecated'), ('warning', 'Warning'), ('system_warning', 'System Warning'), ('error', 'Error')], max_length=100)),
('event_data', awx.main.fields.JSONField(blank=True, default={})),
('failed', models.BooleanField(default=False, editable=False)),
('changed', models.BooleanField(default=False, editable=False)),
('uuid', models.CharField(default=b'', editable=False, max_length=1024)),
('playbook', models.CharField(default=b'', editable=False, max_length=1024)),
('play', models.CharField(default=b'', editable=False, max_length=1024)),
('role', models.CharField(default=b'', editable=False, max_length=1024)),
('task', models.CharField(default=b'', editable=False, max_length=1024)),
('uuid', models.CharField(default='', editable=False, max_length=1024)),
('playbook', models.CharField(default='', editable=False, max_length=1024)),
('play', models.CharField(default='', editable=False, max_length=1024)),
('role', models.CharField(default='', editable=False, max_length=1024)),
('task', models.CharField(default='', editable=False, max_length=1024)),
('counter', models.PositiveIntegerField(default=0, editable=False)),
('stdout', models.TextField(default=b'', editable=False)),
('stdout', models.TextField(default='', editable=False)),
('verbosity', models.PositiveIntegerField(default=0, editable=False)),
('start_line', models.PositiveIntegerField(default=0, editable=False)),
('end_line', models.PositiveIntegerField(default=0, editable=False)),
@@ -66,9 +66,9 @@ class Migration(migrations.Migration):
('created', models.DateTimeField(default=None, editable=False)),
('modified', models.DateTimeField(default=None, editable=False)),
('event_data', awx.main.fields.JSONField(blank=True, default={})),
('uuid', models.CharField(default=b'', editable=False, max_length=1024)),
('uuid', models.CharField(default='', editable=False, max_length=1024)),
('counter', models.PositiveIntegerField(default=0, editable=False)),
('stdout', models.TextField(default=b'', editable=False)),
('stdout', models.TextField(default='', editable=False)),
('verbosity', models.PositiveIntegerField(default=0, editable=False)),
('start_line', models.PositiveIntegerField(default=0, editable=False)),
('end_line', models.PositiveIntegerField(default=0, editable=False)),

View File

@@ -18,77 +18,77 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='organization',
name='execute_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=b'admin_role', related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role='admin_role', related_name='+', to='main.Role'),
),
migrations.AddField(
model_name='organization',
name='job_template_admin_role',
field=awx.main.fields.ImplicitRoleField(editable=False, null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=b'admin_role', related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(editable=False, null='True', on_delete=django.db.models.deletion.CASCADE, parent_role='admin_role', related_name='+', to='main.Role'),
),
migrations.AddField(
model_name='organization',
name='credential_admin_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=b'admin_role', related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role='admin_role', related_name='+', to='main.Role'),
),
migrations.AddField(
model_name='organization',
name='inventory_admin_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=b'admin_role', related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role='admin_role', related_name='+', to='main.Role'),
),
migrations.AddField(
model_name='organization',
name='project_admin_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=b'admin_role', related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role='admin_role', related_name='+', to='main.Role'),
),
migrations.AddField(
model_name='organization',
name='workflow_admin_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=b'admin_role', related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role='admin_role', related_name='+', to='main.Role'),
),
migrations.AddField(
model_name='organization',
name='notification_admin_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=b'admin_role', related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role='admin_role', related_name='+', to='main.Role'),
),
migrations.AlterField(
model_name='credential',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=[b'singleton:system_administrator', b'organization.credential_admin_role'], related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['singleton:system_administrator', 'organization.credential_admin_role'], related_name='+', to='main.Role'),
),
migrations.AlterField(
model_name='inventory',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=b'organization.inventory_admin_role', related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role='organization.inventory_admin_role', related_name='+', to='main.Role'),
),
migrations.AlterField(
model_name='project',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=[b'organization.project_admin_role', b'singleton:system_administrator'], related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['organization.project_admin_role', 'singleton:system_administrator'], related_name='+', to='main.Role'),
),
migrations.AlterField(
model_name='workflowjobtemplate',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=[b'singleton:system_administrator', b'organization.workflow_admin_role'], related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['singleton:system_administrator', 'organization.workflow_admin_role'], related_name='+', to='main.Role'),
),
migrations.AlterField(
model_name='workflowjobtemplate',
name='execute_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=[b'admin_role', b'organization.execute_role'], related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['admin_role', 'organization.execute_role'], related_name='+', to='main.Role'),
),
migrations.AlterField(
model_name='jobtemplate',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(editable=False, null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=[b'project.organization.job_template_admin_role', b'inventory.organization.job_template_admin_role'], related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(editable=False, null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['project.organization.job_template_admin_role', 'inventory.organization.job_template_admin_role'], related_name='+', to='main.Role'),
),
migrations.AlterField(
model_name='jobtemplate',
name='execute_role',
field=awx.main.fields.ImplicitRoleField(null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=[b'admin_role', b'project.organization.execute_role', b'inventory.organization.execute_role'], related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['admin_role', 'project.organization.execute_role', 'inventory.organization.execute_role'], related_name='+', to='main.Role'),
),
migrations.AlterField(
model_name='organization',
name='member_role',
field=awx.main.fields.ImplicitRoleField(editable=False, null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=[b'admin_role', b'execute_role', b'project_admin_role', b'inventory_admin_role', b'workflow_admin_role', b'notification_admin_role', b'credential_admin_role', b'job_template_admin_role'], related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(editable=False, null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['admin_role', 'execute_role', 'project_admin_role', 'inventory_admin_role', 'workflow_admin_role', 'notification_admin_role', 'credential_admin_role', 'job_template_admin_role'], related_name='+', to='main.Role'),
),
]
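
These field changes only swap Python 2 byte-string literals (null=b'True', parent_role=b'admin_role') for plain text strings; under Python 3, bytes and str no longer mix, so the dotted parent_role paths have to be stored as text. A minimal illustration, not AWX code:

parent_role = b'organization.admin_role'
try:
    parent_role.split('.')                 # TypeError on Python 3: bytes need a bytes separator
except TypeError:
    parent_role = parent_role.decode('utf-8')
assert parent_role.split('.') == ['organization', 'admin_role']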

View File

@@ -35,8 +35,8 @@ class Migration(migrations.Migration):
('skip_authorization', models.BooleanField(default=False)),
('created', models.DateTimeField(auto_now_add=True)),
('updated', models.DateTimeField(auto_now=True)),
('description', models.TextField(blank=True, default=b'')),
('logo_data', models.TextField(default=b'', editable=False, validators=[django.core.validators.RegexValidator(re.compile(b'.*'))])),
('description', models.TextField(blank=True, default='')),
('logo_data', models.TextField(default='', editable=False, validators=[django.core.validators.RegexValidator(re.compile('.*'))])),
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='main_oauth2application', to=settings.AUTH_USER_MODEL)),
],
options={
@@ -52,7 +52,7 @@ class Migration(migrations.Migration):
('scope', models.TextField(blank=True)),
('created', models.DateTimeField(auto_now_add=True)),
('updated', models.DateTimeField(auto_now=True)),
('description', models.CharField(blank=True, default=b'', max_length=200)),
('description', models.CharField(blank=True, default='', max_length=200)),
('last_used', models.DateTimeField(default=None, editable=False, null=True)),
('application', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.OAUTH2_PROVIDER_APPLICATION_MODEL)),
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='main_oauth2accesstoken', to=settings.AUTH_USER_MODEL)),

View File

@@ -20,7 +20,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='oauth2accesstoken',
name='scope',
field=models.TextField(blank=True, default=b'write', help_text="Allowed scopes, further restricts user's permissions."),
field=models.TextField(blank=True, default='write', help_text="Allowed scopes, further restricts user's permissions."),
),
migrations.AlterField(
model_name='oauth2accesstoken',
@@ -30,7 +30,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='oauth2application',
name='authorization_grant_type',
field=models.CharField(choices=[(b'authorization-code', 'Authorization code'), (b'implicit', 'Implicit'), (b'password', 'Resource owner password-based'), (b'client-credentials', 'Client credentials')], help_text='The Grant type the user must use for acquire tokens for this application.', max_length=32),
field=models.CharField(choices=[('authorization-code', 'Authorization code'), ('implicit', 'Implicit'), ('password', 'Resource owner password-based'), ('client-credentials', 'Client credentials')], help_text='The Grant type the user must use for acquire tokens for this application.', max_length=32),
),
migrations.AlterField(
model_name='oauth2application',
@@ -40,7 +40,7 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='oauth2application',
name='client_type',
field=models.CharField(choices=[(b'confidential', 'Confidential'), (b'public', 'Public')], help_text='Set to Public or Confidential depending on how secure the client device is.', max_length=32),
field=models.CharField(choices=[('confidential', 'Confidential'), ('public', 'Public')], help_text='Set to Public or Confidential depending on how secure the client device is.', max_length=32),
),
migrations.AlterField(
model_name='oauth2application',

View File

@@ -16,6 +16,6 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='oauth2accesstoken',
name='scope',
field=models.TextField(blank=True, default=b'write', help_text="Allowed scopes, further restricts user's permissions. Must be a simple space-separated string with allowed scopes ['read', 'write']."),
field=models.TextField(blank=True, default='write', help_text="Allowed scopes, further restricts user's permissions. Must be a simple space-separated string with allowed scopes ['read', 'write']."),
),
]

View File

@@ -15,6 +15,6 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='unifiedjob',
name='controller_node',
field=models.TextField(blank=True, default=b'', editable=False, help_text='The instance that managed the isolated execution environment.'),
field=models.TextField(blank=True, default='', editable=False, help_text='The instance that managed the isolated execution environment.'),
),
]

View File

@@ -18,12 +18,12 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='organization',
name='member_role',
field=awx.main.fields.ImplicitRoleField(editable=False, null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=[b'admin_role'], related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(editable=False, null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['admin_role'], related_name='+', to='main.Role'),
),
migrations.AlterField(
model_name='organization',
name='read_role',
field=awx.main.fields.ImplicitRoleField(editable=False, null=b'True', on_delete=django.db.models.deletion.CASCADE, parent_role=[b'member_role', b'auditor_role', b'execute_role', b'project_admin_role', b'inventory_admin_role', b'workflow_admin_role', b'notification_admin_role', b'credential_admin_role', b'job_template_admin_role'], related_name='+', to='main.Role'),
field=awx.main.fields.ImplicitRoleField(editable=False, null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['member_role', 'auditor_role', 'execute_role', 'project_admin_role', 'inventory_admin_role', 'workflow_admin_role', 'notification_admin_role', 'credential_admin_role', 'job_template_admin_role'], related_name='+', to='main.Role'),
),
migrations.RunPython(rebuild_role_hierarchy),
]

View File

@@ -15,6 +15,6 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='oauth2application',
name='authorization_grant_type',
field=models.CharField(choices=[(b'authorization-code', 'Authorization code'), (b'implicit', 'Implicit'), (b'password', 'Resource owner password-based')], help_text='The Grant type the user must use for acquire tokens for this application.', max_length=32),
field=models.CharField(choices=[('authorization-code', 'Authorization code'), ('implicit', 'Implicit'), ('password', 'Resource owner password-based')], help_text='The Grant type the user must use for acquire tokens for this application.', max_length=32),
),
]

View File

@@ -17,131 +17,131 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='credential',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'credential', u'model_name': 'credential'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'credential', 'model_name': 'credential', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='credential',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'credential', u'model_name': 'credential'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'credential', 'model_name': 'credential', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='credentialtype',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'credentialtype', u'model_name': 'credentialtype'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'credentialtype', 'model_name': 'credentialtype', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='credentialtype',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'credentialtype', u'model_name': 'credentialtype'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'credentialtype', 'model_name': 'credentialtype', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='custominventoryscript',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'custominventoryscript', u'model_name': 'custominventoryscript'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'custominventoryscript', 'model_name': 'custominventoryscript', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='custominventoryscript',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'custominventoryscript', u'model_name': 'custominventoryscript'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'custominventoryscript', 'model_name': 'custominventoryscript', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='group',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'group', u'model_name': 'group'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'group', 'model_name': 'group', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='group',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'group', u'model_name': 'group'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'group', 'model_name': 'group', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='host',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'host', u'model_name': 'host'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'host', 'model_name': 'host', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='host',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'host', u'model_name': 'host'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'host', 'model_name': 'host', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='inventory',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'inventory', u'model_name': 'inventory'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'inventory', 'model_name': 'inventory', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='inventory',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'inventory', u'model_name': 'inventory'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'inventory', 'model_name': 'inventory', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='label',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'label', u'model_name': 'label'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'label', 'model_name': 'label', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='label',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'label', u'model_name': 'label'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'label', 'model_name': 'label', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='notificationtemplate',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'notificationtemplate', u'model_name': 'notificationtemplate'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'notificationtemplate', 'model_name': 'notificationtemplate', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='notificationtemplate',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'notificationtemplate', u'model_name': 'notificationtemplate'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'notificationtemplate', 'model_name': 'notificationtemplate', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='organization',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'organization', u'model_name': 'organization'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'organization', 'model_name': 'organization', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='organization',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'organization', u'model_name': 'organization'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'organization', 'model_name': 'organization', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='schedule',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'schedule', u'model_name': 'schedule'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'schedule', 'model_name': 'schedule', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='schedule',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'schedule', u'model_name': 'schedule'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'schedule', 'model_name': 'schedule', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='team',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'team', u'model_name': 'team'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'team', 'model_name': 'team', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='team',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'team', u'model_name': 'team'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'team', 'model_name': 'team', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='unifiedjob',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'unifiedjob', u'model_name': 'unifiedjob'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'unifiedjob', 'model_name': 'unifiedjob', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='unifiedjob',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'unifiedjob', u'model_name': 'unifiedjob'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'unifiedjob', 'model_name': 'unifiedjob', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='created_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'unifiedjobtemplate', u'model_name': 'unifiedjobtemplate'}(class)s_created+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'unifiedjobtemplate', 'model_name': 'unifiedjobtemplate', 'app_label': 'main'}(class)s_created+", to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='modified_by',
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{u'app_label': 'main', u'class': 'unifiedjobtemplate', u'model_name': 'unifiedjobtemplate'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
field=models.ForeignKey(default=None, editable=False, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name="{'class': 'unifiedjobtemplate', 'model_name': 'unifiedjobtemplate', 'app_label': 'main'}(class)s_modified+", to=settings.AUTH_USER_MODEL),
),
]
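
The unusual related_name values in this hunk are just Python's repr() of a small dict embedded in the string; regenerating the migration under Python 3 drops the u'' prefixes from the keys and fixes the key order, which is the only difference between the old and new lines. Plain-Python illustration:

opts = {'class': 'credential', 'model_name': 'credential', 'app_label': 'main'}
print(repr(opts))
# Python 3: {'class': 'credential', 'model_name': 'credential', 'app_label': 'main'}
# Python 2 rendered the same dict with u'' on the keys and hash-dependent ordering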

View File

@@ -0,0 +1,25 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.16 on 2019-01-20 12:00
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('main', '0054_v340_workflow_convergence'),
]
operations = [
migrations.AlterField(
model_name='notification',
name='notification_type',
field=models.CharField(choices=[('email', 'Email'), ('slack', 'Slack'), ('twilio', 'Twilio'), ('pagerduty', 'Pagerduty'), ('grafana', 'Grafana'), ('hipchat', 'HipChat'), ('webhook', 'Webhook'), ('mattermost', 'Mattermost'), ('rocketchat', 'Rocket.Chat'), ('irc', 'IRC')], max_length=32),
),
migrations.AlterField(
model_name='notificationtemplate',
name='notification_type',
field=models.CharField(choices=[('email', 'Email'), ('slack', 'Slack'), ('twilio', 'Twilio'), ('pagerduty', 'Pagerduty'), ('grafana', 'Grafana'), ('hipchat', 'HipChat'), ('webhook', 'Webhook'), ('mattermost', 'Mattermost'), ('rocketchat', 'Rocket.Chat'), ('irc', 'IRC')], max_length=32),
),
]

View File

@@ -45,8 +45,8 @@ def replaces(squashed, applied=False):
'''
squashed_keys, key_index = squash_data(squashed)
if applied:
return [(b'main', key) for key in squashed_keys[:key_index]]
return [(b'main', key) for key in squashed_keys[key_index:]]
return [('main', key) for key in squashed_keys[:key_index]]
return [('main', key) for key in squashed_keys[key_index:]]
def operations(squashed, applied=False):
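
replaces() builds the (app_label, migration_name) pairs that Django's squashed-migration handling expects, and on Python 3 the app label must be a plain str rather than bytes. A hedged sketch of the return shape; squash_data() and the key list below are invented for illustration:

squashed_keys = ['0001_initial', '0002_squashed_updates', '0003_changes']   # hypothetical
key_index = 1   # pretend squash_data() reported the first key as already applied
applied = [('main', key) for key in squashed_keys[:key_index]]
pending = [('main', key) for key in squashed_keys[key_index:]]
assert applied == [('main', '0001_initial')]
assert pending == [('main', '0002_squashed_updates'), ('main', '0003_changes')]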

View File

@@ -42,12 +42,12 @@ SQUASHED_30 = {
migrations.AlterField(
model_name='credential',
name='admin_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'singleton:system_administrator', b'organization.admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['singleton:system_administrator', 'organization.admin_role'], to='main.Role', null='True'),
),
migrations.AlterField(
model_name='credential',
name='use_role',
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=[b'admin_role'], to='main.Role', null=b'True'),
field=awx.main.fields.ImplicitRoleField(related_name='+', parent_role=['admin_role'], to='main.Role', null='True'),
),
],
'0033_v303_v245_host_variable_fix': [

View File

@@ -17,24 +17,24 @@ SQUASHED_31 = {
migrations.AlterField(
model_name='project',
name='scm_type',
field=models.CharField(default=b'', choices=[(b'', 'Manual'), (b'git', 'Git'), (b'hg', 'Mercurial'), (b'svn', 'Subversion'), (b'insights', 'Red Hat Insights')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
field=models.CharField(default='', choices=[('', 'Manual'), ('git', 'Git'), ('hg', 'Mercurial'), ('svn', 'Subversion'), ('insights', 'Red Hat Insights')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
),
migrations.AlterField(
model_name='projectupdate',
name='scm_type',
field=models.CharField(default=b'', choices=[(b'', 'Manual'), (b'git', 'Git'), (b'hg', 'Mercurial'), (b'svn', 'Subversion'), (b'insights', 'Red Hat Insights')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
field=models.CharField(default='', choices=[('', 'Manual'), ('git', 'Git'), ('hg', 'Mercurial'), ('svn', 'Subversion'), ('insights', 'Red Hat Insights')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
),
],
'0036_v311_insights': [
migrations.AlterField(
model_name='project',
name='scm_type',
field=models.CharField(default=b'', choices=[(b'', 'Manual'), (b'git', 'Git'), (b'hg', 'Mercurial'), (b'svn', 'Subversion'), (b'insights', 'Red Hat Insights')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
field=models.CharField(default='', choices=[('', 'Manual'), ('git', 'Git'), ('hg', 'Mercurial'), ('svn', 'Subversion'), ('insights', 'Red Hat Insights')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
),
migrations.AlterField(
model_name='projectupdate',
name='scm_type',
field=models.CharField(default=b'', choices=[(b'', 'Manual'), (b'git', 'Git'), (b'hg', 'Mercurial'), (b'svn', 'Subversion'), (b'insights', 'Red Hat Insights')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
field=models.CharField(default='', choices=[('', 'Manual'), ('git', 'Git'), ('hg', 'Mercurial'), ('svn', 'Subversion'), ('insights', 'Red Hat Insights')], max_length=8, blank=True, help_text='Specifies the source control system used to store the project.', verbose_name='SCM Type'),
),
],
'0037_v313_instance_version': [

View File

@@ -134,6 +134,9 @@ User.add_to_class('is_in_enterprise_category', user_is_in_enterprise_category)
def o_auth2_application_get_absolute_url(self, request=None):
# this page does not exist in v1
if request.version == 'v1':
return reverse('api:o_auth2_application_detail', kwargs={'pk': self.pk}) # use default version
return reverse('api:o_auth2_application_detail', kwargs={'pk': self.pk}, request=request)
@@ -141,15 +144,14 @@ OAuth2Application.add_to_class('get_absolute_url', o_auth2_application_get_absol
def o_auth2_token_get_absolute_url(self, request=None):
# this page does not exist in v1
if request.version == 'v1':
return reverse('api:o_auth2_token_detail', kwargs={'pk': self.pk}) # use default version
return reverse('api:o_auth2_token_detail', kwargs={'pk': self.pk}, request=request)
OAuth2AccessToken.add_to_class('get_absolute_url', o_auth2_token_get_absolute_url)
# Import signal handlers only after models have been defined.
import awx.main.signals # noqa
from awx.main.registrar import activity_stream_registrar # noqa
activity_stream_registrar.connect(Organization)
activity_stream_registrar.connect(Inventory)

View File

@@ -7,6 +7,7 @@ from awx.main.fields import JSONField
# Django
from django.db import models
from django.utils.encoding import smart_str
from django.utils.translation import ugettext_lazy as _
__all__ = ['ActivityStream']
@@ -84,9 +85,9 @@ class ActivityStream(models.Model):
if self.actor:
self.deleted_actor = {
'id': self.actor_id,
'username': self.actor.username,
'first_name': self.actor.first_name,
'last_name': self.actor.last_name,
'username': smart_str(self.actor.username),
'first_name': smart_str(self.actor.first_name),
'last_name': smart_str(self.actor.last_name),
}
if 'update_fields' in kwargs and 'deleted_actor' not in kwargs['update_fields']:
kwargs['update_fields'].append('deleted_actor')
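
Wrapping the actor fields in smart_str keeps the deleted_actor snapshot as plain text even if the user model hands back bytes or a lazy value. A minimal sketch (assumes Django is importable; the values are dummies):

from django.utils.encoding import smart_str

deleted_actor = {
    'id': 42,
    'username': smart_str(b'alice'),   # bytes are decoded to 'alice'
    'first_name': smart_str('Alice'),  # text passes through unchanged
}
assert deleted_actor['username'] == 'alice'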

View File

@@ -3,7 +3,7 @@
# Python
import logging
from urlparse import urljoin
from urllib.parse import urljoin
# Django
from django.conf import settings
@@ -109,7 +109,7 @@ class AdHocCommand(UnifiedJob, JobNotificationMixin):
return self.limit
def clean_module_name(self):
if type(self.module_name) not in (str, unicode):
if type(self.module_name) is not str:
raise ValidationError(_("Invalid type for ad hoc command"))
module_name = self.module_name.strip() or 'command'
if module_name not in settings.AD_HOC_COMMANDS:
@@ -117,7 +117,7 @@ class AdHocCommand(UnifiedJob, JobNotificationMixin):
return module_name
def clean_module_args(self):
if type(self.module_args) not in (str, unicode):
if type(self.module_args) is not str:
raise ValidationError(_("Invalid type for ad hoc command"))
module_args = self.module_args
if self.module_name in ('command', 'shell') and not module_args:

View File

@@ -92,7 +92,7 @@ class BaseModel(models.Model):
class Meta:
abstract = True
def __unicode__(self):
def __str__(self):
if 'name' in self.__dict__:
return u'%s-%s' % (self.name, self.pk)
else:
@@ -152,7 +152,7 @@ class CreatedModifiedModel(BaseModel):
)
def save(self, *args, **kwargs):
update_fields = kwargs.get('update_fields', [])
update_fields = list(kwargs.get('update_fields', []))
# Manually perform auto_now_add and auto_now logic.
if not self.pk and not self.created:
self.created = now()

View File

@@ -383,6 +383,9 @@ class Credential(PasswordFieldsModel, CommonModelNameNotUnique, ResourceMixin):
super(Credential, self).save(*args, **kwargs)
def encrypt_field(self, field, ask):
if not hasattr(self, field):
return None
encrypted = encrypt_field(self, field, ask=ask)
if encrypted:
self.inputs[field] = encrypted
@@ -413,12 +416,12 @@ class Credential(PasswordFieldsModel, CommonModelNameNotUnique, ResourceMixin):
type_alias = self.credential_type.name
else:
type_alias = self.credential_type_id
if self.kind == 'vault' and self.inputs.get('vault_id', None):
if self.kind == 'vault' and self.has_input('vault_id'):
if display:
fmt_str = six.text_type('{} (id={})')
else:
fmt_str = six.text_type('{}_{}')
return fmt_str.format(type_alias, self.inputs.get('vault_id'))
return fmt_str.format(type_alias, self.get_input('vault_id'))
return six.text_type(type_alias)
@staticmethod
@@ -428,6 +431,29 @@ class Credential(PasswordFieldsModel, CommonModelNameNotUnique, ResourceMixin):
ret[cred.unique_hash()] = cred
return ret
def get_input(self, field_name, **kwargs):
"""
Get an injectable and decrypted value for an input field.
Retrieves the value for a given credential input field name. Return
values for secret input fields are decrypted. If the credential doesn't
have an input value defined for the given field name, an AttributeError
is raised unless a default value is provided.
:param field_name(str): The name of the input field.
:param default(optional[str]): A default return value to use.
"""
if field_name in self.credential_type.secret_fields:
return decrypt_field(self, field_name)
if field_name in self.inputs:
return self.inputs[field_name]
if 'default' in kwargs:
return kwargs['default']
raise AttributeError(field_name)
def has_input(self, field_name):
return field_name in self.inputs and self.inputs[field_name] not in ('', None)
class CredentialType(CommonModelNameNotUnique):
'''
@@ -477,6 +503,9 @@ class CredentialType(CommonModelNameNotUnique):
)
def get_absolute_url(self, request=None):
# Page does not exist in API v1
if request.version == 'v1':
return reverse('api:credential_type_detail', kwargs={'pk': self.pk})
return reverse('api:credential_type_detail', kwargs={'pk': self.pk}, request=request)
@property
@@ -606,8 +635,9 @@ class CredentialType(CommonModelNameNotUnique):
safe_namespace[field_name] = namespace[field_name] = value
continue
value = credential.get_input(field_name)
if field_name in self.secret_fields:
value = decrypt_field(credential, field_name)
safe_namespace[field_name] = '**********'
elif len(value):
safe_namespace[field_name] = value
@@ -627,7 +657,7 @@ class CredentialType(CommonModelNameNotUnique):
data = Template(file_tmpl).render(**namespace)
_, path = tempfile.mkstemp(dir=private_data_dir)
with open(path, 'w') as f:
f.write(data.encode('utf-8'))
f.write(data)
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)
# determine if filename indicates single file or many
@@ -977,7 +1007,7 @@ def cloudforms(cls):
'label': ugettext_noop('CloudForms URL'),
'type': 'string',
'help_text': ugettext_noop('Enter the URL for the virtual machine that '
'corresponds to your CloudForm instance. '
'corresponds to your CloudForms instance. '
'For example, https://cloudforms.example.org')
}, {
'id': 'username',
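
The docstring above spells out the new access contract: get_input() decrypts secret fields, falls back to an explicit default, and otherwise raises AttributeError, while has_input() treats empty strings and None as absent. A stand-in that mimics the non-secret behaviour (FakeInputs is hypothetical, not the real Credential model):

class FakeInputs:
    inputs = {'vault_id': 'prod', 'password': ''}
    def get_input(self, name, **kwargs):
        if name in self.inputs:
            return self.inputs[name]
        if 'default' in kwargs:
            return kwargs['default']
        raise AttributeError(name)
    def has_input(self, name):
        return name in self.inputs and self.inputs[name] not in ('', None)

cred = FakeInputs()
assert cred.has_input('vault_id') and not cred.has_input('password')
assert cred.get_input('host', default='') == ''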

View File

@@ -3,25 +3,28 @@ import os
import stat
import tempfile
from awx.main.utils import decrypt_field
from django.conf import settings
def aws(cred, env, private_data_dir):
env['AWS_ACCESS_KEY_ID'] = cred.username
env['AWS_SECRET_ACCESS_KEY'] = decrypt_field(cred, 'password')
if len(cred.security_token) > 0:
env['AWS_SECURITY_TOKEN'] = decrypt_field(cred, 'security_token')
env['AWS_ACCESS_KEY_ID'] = cred.get_input('username', default='')
env['AWS_SECRET_ACCESS_KEY'] = cred.get_input('password', default='')
if cred.has_input('security_token'):
env['AWS_SECURITY_TOKEN'] = cred.get_input('security_token', default='')
def gce(cred, env, private_data_dir):
env['GCE_EMAIL'] = cred.username
env['GCE_PROJECT'] = cred.project
project = cred.get_input('project', default='')
username = cred.get_input('username', default='')
env['GCE_EMAIL'] = username
env['GCE_PROJECT'] = project
json_cred = {
'type': 'service_account',
'private_key': decrypt_field(cred, 'ssh_key_data'),
'client_email': cred.username,
'project_id': cred.project
'private_key': cred.get_input('ssh_key_data', default=''),
'client_email': username,
'project_id': project
}
handle, path = tempfile.mkstemp(dir=private_data_dir)
f = os.fdopen(handle, 'w')
@@ -32,21 +35,25 @@ def gce(cred, env, private_data_dir):
def azure_rm(cred, env, private_data_dir):
if len(cred.client) and len(cred.tenant):
env['AZURE_CLIENT_ID'] = cred.client
env['AZURE_SECRET'] = decrypt_field(cred, 'secret')
env['AZURE_TENANT'] = cred.tenant
env['AZURE_SUBSCRIPTION_ID'] = cred.subscription
client = cred.get_input('client', default='')
tenant = cred.get_input('tenant', default='')
if len(client) and len(tenant):
env['AZURE_CLIENT_ID'] = client
env['AZURE_TENANT'] = tenant
env['AZURE_SECRET'] = cred.get_input('secret', default='')
env['AZURE_SUBSCRIPTION_ID'] = cred.get_input('subscription', default='')
else:
env['AZURE_SUBSCRIPTION_ID'] = cred.subscription
env['AZURE_AD_USER'] = cred.username
env['AZURE_PASSWORD'] = decrypt_field(cred, 'password')
if cred.inputs.get('cloud_environment', None):
env['AZURE_CLOUD_ENVIRONMENT'] = cred.inputs['cloud_environment']
env['AZURE_SUBSCRIPTION_ID'] = cred.get_input('subscription', default='')
env['AZURE_AD_USER'] = cred.get_input('username', default='')
env['AZURE_PASSWORD'] = cred.get_input('password', default='')
if cred.has_input('cloud_environment'):
env['AZURE_CLOUD_ENVIRONMENT'] = cred.get_input('cloud_environment')
def vmware(cred, env, private_data_dir):
env['VMWARE_USER'] = cred.username
env['VMWARE_PASSWORD'] = decrypt_field(cred, 'password')
env['VMWARE_HOST'] = cred.host
env['VMWARE_USER'] = cred.get_input('username', default='')
env['VMWARE_PASSWORD'] = cred.get_input('password', default='')
env['VMWARE_HOST'] = cred.get_input('host', default='')
env['VMWARE_VALIDATE_CERTS'] = str(settings.VMWARE_VALIDATE_CERTS)
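
Every injector above now reads values exclusively through get_input()/has_input() with explicit defaults instead of touching cred.username or calling decrypt_field directly. A self-contained rendering of the aws() pattern, with a plain dict standing in for the credential inputs:

def aws_env(inputs):
    env = {
        'AWS_ACCESS_KEY_ID': inputs.get('username', ''),
        'AWS_SECRET_ACCESS_KEY': inputs.get('password', ''),
    }
    if inputs.get('security_token') not in ('', None):   # mirrors has_input()
        env['AWS_SECURITY_TOKEN'] = inputs['security_token']
    return env

print(aws_env({'username': 'AKIA...', 'password': 's3cret'}))
# {'AWS_ACCESS_KEY_ID': 'AKIA...', 'AWS_SECRET_ACCESS_KEY': 's3cret'}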

View File

@@ -27,7 +27,7 @@ __all__ = ['JobEvent', 'ProjectUpdateEvent', 'AdHocCommandEvent',
def sanitize_event_keys(kwargs, valid_keys):
# Sanity check: Don't honor keys that we don't recognize.
for key in kwargs.keys():
for key in list(kwargs.keys()):
if key not in valid_keys:
kwargs.pop(key)
@@ -424,7 +424,7 @@ class JobEvent(BasePlaybookEvent):
def get_absolute_url(self, request=None):
return reverse('api:job_event_detail', kwargs={'pk': self.pk}, request=request)
def __unicode__(self):
def __str__(self):
return u'%s @ %s' % (self.get_event_display2(), self.created.isoformat())
def _update_from_event_data(self):
@@ -580,7 +580,7 @@ class BaseCommandEvent(CreatedModifiedModel):
editable=False,
)
def __unicode__(self):
def __str__(self):
return u'%s @ %s' % (self.get_event_display(), self.created.isoformat())
@classmethod

View File

@@ -1,11 +1,9 @@
# Copyright (c) 2015 Ansible, Inc.
# All Rights Reserved.
import six
import random
from decimal import Decimal
from django.core.exceptions import ValidationError
from django.core.validators import MinValueValidator
from django.db import models, connection
from django.db.models.signals import post_save, post_delete
@@ -31,15 +29,6 @@ from awx.main.models.mixins import RelatedJobsMixin
__all__ = ('Instance', 'InstanceGroup', 'JobOrigin', 'TowerScheduleState',)
def validate_queuename(v):
# kombu doesn't play nice with unicode in queue names
if v:
try:
'{}'.format(v.decode('utf-8'))
except UnicodeEncodeError:
raise ValidationError(_(six.text_type('{} contains unsupported characters')).format(v))
class HasPolicyEditsMixin(HasEditsMixin):
class Meta:
@@ -164,7 +153,6 @@ class Instance(HasPolicyEditsMixin, BaseModel):
'memory', 'cpu_capacity', 'mem_capacity'])
def clean_hostname(self):
validate_queuename(self.hostname)
return self.hostname
@@ -235,7 +223,6 @@ class InstanceGroup(HasPolicyEditsMixin, BaseModel, RelatedJobsMixin):
app_label = 'main'
def clean_name(self):
validate_queuename(self.name)
return self.name
def fit_task_to_most_remaining_capacity_instance(self, task):

View File

@@ -3,12 +3,14 @@
# Python
import datetime
import time
import itertools
import logging
import re
import copy
from urlparse import urljoin
import os.path
import six
from urllib.parse import urljoin
# Django
from django.conf import settings
@@ -342,9 +344,13 @@ class Inventory(CommonModelNameNotUnique, ResourceMixin, RelatedJobsMixin):
host_updates = hosts_to_update.setdefault(host_pk, {})
host_updates['has_inventory_sources'] = False
# Now apply updates to hosts where needed (in batches).
all_update_pks = hosts_to_update.keys()
for offset in xrange(0, len(all_update_pks), 500):
update_pks = all_update_pks[offset:(offset + 500)]
all_update_pks = list(hosts_to_update.keys())
def _chunk(items, chunk_size):
for i, group in itertools.groupby(enumerate(items), lambda x: x[0] // chunk_size):
yield (g[1] for g in group)
for update_pks in _chunk(all_update_pks, 500):
for host in hosts_qs.filter(pk__in=update_pks):
host_updates = hosts_to_update[host.pk]
for field, value in host_updates.items():
@@ -411,12 +417,12 @@ class Inventory(CommonModelNameNotUnique, ResourceMixin, RelatedJobsMixin):
failed_group_pks.add(group_pk)
# Now apply updates to each group as needed (in batches).
all_update_pks = groups_to_update.keys()
for offset in xrange(0, len(all_update_pks), 500):
all_update_pks = list(groups_to_update.keys())
for offset in range(0, len(all_update_pks), 500):
update_pks = all_update_pks[offset:(offset + 500)]
for group in self.groups.filter(pk__in=update_pks):
group_updates = groups_to_update[group.pk]
for field, value in group_updates.items():
for field, value in list(group_updates.items()):
if getattr(group, field) != value:
setattr(group, field, value)
else:
@@ -428,7 +434,8 @@ class Inventory(CommonModelNameNotUnique, ResourceMixin, RelatedJobsMixin):
'''
Update model fields that are computed from database relationships.
'''
logger.debug("Going to update inventory computed fields")
logger.debug("Going to update inventory computed fields, pk={0}".format(self.pk))
start_time = time.time()
if update_hosts:
self.update_host_computed_fields()
if update_groups:
@@ -456,7 +463,7 @@ class Inventory(CommonModelNameNotUnique, ResourceMixin, RelatedJobsMixin):
}
# CentOS python seems to have issues clobbering the inventory on poor timing during certain operations
iobj = Inventory.objects.get(id=self.id)
for field, value in computed_fields.items():
for field, value in list(computed_fields.items()):
if getattr(iobj, field) != value:
setattr(iobj, field, value)
# update in-memory object
@@ -465,7 +472,8 @@ class Inventory(CommonModelNameNotUnique, ResourceMixin, RelatedJobsMixin):
computed_fields.pop(field)
if computed_fields:
iobj.save(update_fields=computed_fields.keys())
logger.debug("Finished updating inventory computed fields")
logger.debug("Finished updating inventory computed fields, pk={0}, in "
"{1:.3f} seconds".format(self.pk, time.time() - start_time))
def websocket_emit_status(self, status):
connection.on_commit(lambda: emit_channel_notification(
@@ -1733,6 +1741,14 @@ class InventoryUpdate(UnifiedJob, InventorySourceOptions, JobNotificationMixin,
return self.global_instance_groups
return selected_groups
@property
def ansible_virtualenv_path(self):
if self.inventory_source and self.inventory_source.inventory:
organization = self.inventory_source.inventory.organization
if organization and organization.custom_virtualenv:
return organization.custom_virtualenv
return settings.ANSIBLE_VENV_PATH
def cancel(self, job_explanation=None, is_chain=False):
res = super(InventoryUpdate, self).cancel(job_explanation=job_explanation, is_chain=is_chain)
if res:
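
Two Python 3 details in this file: .keys() views are materialised with list() before batching, and the host updates now go through a small groupby-based _chunk helper (the group updates further down keep the plain range/slice approach). The helper in isolation, runnable as-is:

import itertools

def _chunk(items, chunk_size):
    for _, group in itertools.groupby(enumerate(items), lambda x: x[0] // chunk_size):
        yield (g[1] for g in group)

assert [list(c) for c in _chunk(list(range(7)), 3)] == [[0, 1, 2], [3, 4, 5], [6]]

The new ansible_virtualenv_path property at the end of the hunk is what points an inventory update at the owning organization's custom virtualenv when one is configured.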

View File

@@ -8,7 +8,7 @@ import logging
import os
import time
import json
from urlparse import urljoin
from urllib.parse import urljoin
import six
@@ -347,8 +347,8 @@ class JobTemplate(UnifiedJobTemplate, JobOptions, SurveyJobTemplateMixin, Resour
except JobLaunchConfig.DoesNotExist:
wj_config = JobLaunchConfig()
actual_inventory = wj_config.inventory if wj_config.inventory else self.inventory
for idx in xrange(min(self.job_slice_count,
actual_inventory.hosts.count())):
for idx in range(min(self.job_slice_count,
actual_inventory.hosts.count())):
create_kwargs = dict(workflow_job=job,
unified_job_template=self,
ancestor_artifacts=dict(job_slice=idx + 1))
@@ -695,7 +695,7 @@ class Job(UnifiedJob, JobOptions, SurveyJobMixin, JobNotificationMixin, TaskMana
count_hosts = Host.objects.filter(inventory__jobs__pk=self.pk).count()
if self.job_slice_count > 1:
# Integer division intentional
count_hosts = (count_hosts + self.job_slice_count - self.job_slice_number) / self.job_slice_count
count_hosts = (count_hosts + self.job_slice_count - self.job_slice_number) // self.job_slice_count
return min(count_hosts, 5 if self.forks == 0 else self.forks) + 1
@property
@@ -1120,7 +1120,7 @@ class JobHostSummary(CreatedModifiedModel):
skipped = models.PositiveIntegerField(default=0, editable=False)
failed = models.BooleanField(default=False, editable=False)
def __unicode__(self):
def __str__(self):
host = getattr_dne(self, 'host')
hostname = host.name if host else 'N/A'
return '%s changed=%d dark=%d failures=%d ok=%d processed=%d skipped=%s' % \

View File

@@ -20,6 +20,7 @@ from awx.main.notifications.pagerduty_backend import PagerDutyBackend
from awx.main.notifications.hipchat_backend import HipChatBackend
from awx.main.notifications.webhook_backend import WebhookBackend
from awx.main.notifications.mattermost_backend import MattermostBackend
from awx.main.notifications.grafana_backend import GrafanaBackend
from awx.main.notifications.rocketchat_backend import RocketChatBackend
from awx.main.notifications.irc_backend import IrcBackend
from awx.main.fields import JSONField
@@ -36,6 +37,7 @@ class NotificationTemplate(CommonModelNameNotUnique):
('slack', _('Slack'), SlackBackend),
('twilio', _('Twilio'), TwilioBackend),
('pagerduty', _('Pagerduty'), PagerDutyBackend),
('grafana', _('Grafana'), GrafanaBackend),
('hipchat', _('HipChat'), HipChatBackend),
('webhook', _('Webhook'), WebhookBackend),
('mattermost', _('Mattermost'), MattermostBackend),
@@ -82,7 +84,7 @@ class NotificationTemplate(CommonModelNameNotUnique):
setattr(self, '_saved_{}_{}'.format("config", field), value)
self.notification_configuration[field] = ''
else:
encrypted = encrypt_field(self, 'notification_configuration', subfield=field, skip_utf8=True)
encrypted = encrypt_field(self, 'notification_configuration', subfield=field)
self.notification_configuration[field] = encrypted
if 'notification_configuration' not in update_fields:
update_fields.append('notification_configuration')

View File

@@ -4,7 +4,7 @@
# Python
import datetime
import os
import urlparse
import urllib.parse as urlparse
# Django
from django.conf import settings
@@ -68,7 +68,7 @@ class ProjectOptions(models.Model):
@classmethod
def get_local_path_choices(cls):
if os.path.exists(settings.PROJECTS_ROOT):
paths = [x.decode('utf-8') for x in os.listdir(settings.PROJECTS_ROOT)
paths = [x for x in os.listdir(settings.PROJECTS_ROOT)
if (os.path.isdir(os.path.join(settings.PROJECTS_ROOT, x)) and
not x.startswith('.') and not x.startswith('_'))]
qs = Project.objects
@@ -166,8 +166,8 @@ class ProjectOptions(models.Model):
check_special_cases=False)
scm_url_parts = urlparse.urlsplit(scm_url)
# Prefer the username/password in the URL, if provided.
scm_username = scm_url_parts.username or cred.username or ''
if scm_url_parts.password or cred.password:
scm_username = scm_url_parts.username or cred.get_input('username', default='')
if scm_url_parts.password or cred.has_input('password'):
scm_password = '********'
else:
scm_password = ''
@@ -475,6 +475,21 @@ class ProjectUpdate(UnifiedJob, ProjectOptions, JobNotificationMixin, TaskManage
def _get_parent_field_name(self):
return 'project'
def _update_parent_instance(self):
if not self.project:
return # no parent instance to update
if self.job_type == PERM_INVENTORY_DEPLOY:
# Do not update project status if this is sync job
# unless no other updates have happened or started
first_update = False
if self.project.status == 'never updated' and self.status == 'running':
first_update = True
elif self.project.current_job == self:
first_update = True
if not first_update:
return
return super(ProjectUpdate, self)._update_parent_instance()
@classmethod
def _get_task_class(cls):
from awx.main.tasks import RunProjectUpdate

View File

@@ -155,7 +155,7 @@ class Role(models.Model):
object_id = models.PositiveIntegerField(null=True, default=None)
content_object = GenericForeignKey('content_type', 'object_id')
def __unicode__(self):
def __str__(self):
if 'role_field' in self.__dict__:
return u'%s-%s' % (self.name, self.pk)
else:
@@ -315,7 +315,7 @@ class Role(models.Model):
# minus 4k of padding for the other parts of the query, leads us
# to the magic number of 41496, or 40000 for a nice round number
def split_ids_for_sqlite(role_ids):
for i in xrange(0, len(role_ids), 40000):
for i in range(0, len(role_ids), 40000):
yield role_ids[i:i + 40000]

View File

@@ -209,7 +209,7 @@ class Schedule(CommonModel, LaunchTimeConfig):
pass
return x
def __unicode__(self):
def __str__(self):
return u'%s_t%s_%s_%s' % (self.name, self.unified_job_template.id, self.id, self.next_run)
def get_absolute_url(self, request=None):

View File

@@ -2,7 +2,8 @@
# All Rights Reserved.
# Python
from StringIO import StringIO
from io import StringIO
import codecs
import json
import logging
import os
@@ -353,7 +354,8 @@ class UnifiedJobTemplate(PolymorphicModel, CommonModelNameNotUnique, Notificatio
logger.warn(six.text_type('Fields {} are not allowed as overrides to spawn from {}.').format(
six.text_type(', ').join(unallowed_fields), self
))
map(validated_kwargs.pop, unallowed_fields)
for f in unallowed_fields:
validated_kwargs.pop(f)
unified_job = copy_model_by_class(self, unified_job_class, fields, validated_kwargs)
@@ -697,9 +699,9 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
)
def get_absolute_url(self, request=None):
real_instance = self.get_real_instance()
if real_instance != self:
return real_instance.get_absolute_url(request=request)
RealClass = self.get_real_instance_class()
if RealClass != UnifiedJob:
return RealClass.get_absolute_url(RealClass(pk=self.pk), request=request)
else:
return ''
@@ -735,7 +737,7 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
def _resources_sufficient_for_launch(self):
return True
def __unicode__(self):
def __str__(self):
return u'%s-%s-%s' % (self.created, self.id, self.status)
@property
@@ -900,7 +902,7 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
parent = getattr(self, self._get_parent_field_name())
if parent is None:
return
valid_fields = parent.get_ask_mapping().keys()
valid_fields = list(parent.get_ask_mapping().keys())
# Special cases allowed for workflows
if hasattr(self, 'extra_vars'):
valid_fields.extend(['survey_passwords', 'extra_vars'])
@@ -991,9 +993,11 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
if not os.path.exists(settings.JOBOUTPUT_ROOT):
os.makedirs(settings.JOBOUTPUT_ROOT)
fd = tempfile.NamedTemporaryFile(
mode='w',
prefix='{}-{}-'.format(self.model_to_str(), self.pk),
suffix='.out',
dir=settings.JOBOUTPUT_ROOT
dir=settings.JOBOUTPUT_ROOT,
encoding='utf-8'
)
# Before the addition of event-based stdout, older versions of
@@ -1008,7 +1012,7 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
fd.write(legacy_stdout_text)
if hasattr(fd, 'name'):
fd.flush()
return open(fd.name, 'r')
return codecs.open(fd.name, 'r', encoding='utf-8')
else:
# we just wrote to this StringIO, so rewind it
fd.seek(0)
@@ -1030,10 +1034,16 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
# don't bother actually fetching the data
total = self.get_event_queryset().aggregate(
total=models.Sum(models.Func(models.F('stdout'), function='LENGTH'))
)['total']
)['total'] or 0
if total > max_supported:
raise StdoutMaxBytesExceeded(total, max_supported)
# psycopg2's copy_expert writes bytes, but callers of this
# function assume a str-based fd will be returned; decode
# .write() calls on the fly to maintain this interface
_write = fd.write
fd.write = lambda s: _write(smart_text(s))
cursor.copy_expert(
"copy (select stdout from {} where {}={} order by start_line) to stdout".format(
self._meta.db_table + 'event',
@@ -1048,7 +1058,7 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
# up escaped line sequences
fd.flush()
subprocess.Popen("sed -i 's/\\\\r\\\\n/\\n/g' {}".format(fd.name), shell=True).wait()
return open(fd.name, 'r')
return codecs.open(fd.name, 'r', encoding='utf-8')
else:
# If we're dealing with an in-memory string buffer, use
# string.replace()
@@ -1063,7 +1073,7 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
return content
def _result_stdout_raw(self, redact_sensitive=False, escape_ascii=False):
content = self.result_stdout_raw_handle().read().decode('utf-8')
content = self.result_stdout_raw_handle().read()
if redact_sensitive:
content = UriCleaner.remove_sensitive(content)
if escape_ascii:
@@ -1096,7 +1106,7 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
else:
end_actual = len(stdout_lines)
return_buffer = return_buffer.getvalue().decode('utf-8')
return_buffer = return_buffer.getvalue()
if redact_sensitive:
return_buffer = UriCleaner.remove_sensitive(return_buffer)
if escape_ascii:
@@ -1314,7 +1324,8 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
def cancel(self, job_explanation=None, is_chain=False):
if self.can_cancel:
if not is_chain:
map(lambda x: x.cancel(job_explanation=self._build_job_explanation(), is_chain=True), self.get_jobs_fail_chain())
for x in self.get_jobs_fail_chain():
x.cancel(job_explanation=self._build_job_explanation(), is_chain=True)
if not self.cancel_flag:
self.cancel_flag = True
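
The stdout plumbing standardises on text here: temp files are opened with encoding='utf-8', callers stop calling .decode(), and because psycopg2's copy_expert hands the file object bytes, write() is wrapped to decode on the fly. A hedged sketch of that shim idea using a small wrapper class (TextShim is hypothetical; the real code patches fd.write in place):

import io

class TextShim:
    """Decode bytes written by copy_expert so downstream code keeps seeing str."""
    def __init__(self, fd):
        self._fd = fd
    def write(self, data):
        if isinstance(data, bytes):
            data = data.decode('utf-8')
        return self._fd.write(data)

buf = io.StringIO()
TextShim(buf).write(b'ok: true\n')
assert buf.getvalue() == 'ok: true\n'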

View File

@@ -2,7 +2,6 @@
# All Rights Reserved.
# Python
#import urlparse
import logging
# Django
@@ -37,7 +36,7 @@ from awx.main.redact import REPLACE_STR
from awx.main.fields import JSONField
from copy import copy
from urlparse import urljoin
from urllib.parse import urljoin
__all__ = ['WorkflowJobTemplate', 'WorkflowJob', 'WorkflowJobOptions', 'WorkflowJobNode', 'WorkflowJobTemplateNode',]

View File

@@ -0,0 +1,66 @@
# Copyright (c) 2016 Ansible, Inc.
# All Rights Reserved.
import datetime
import logging
import requests
import dateutil.parser as dp
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
from awx.main.notifications.base import AWXBaseEmailBackend
logger = logging.getLogger('awx.main.notifications.grafana_backend')
class GrafanaBackend(AWXBaseEmailBackend):
init_parameters = {"grafana_url": {"label": "Grafana URL", "type": "string"},
"grafana_key": {"label": "Grafana API Key", "type": "password"}}
recipient_parameter = "grafana_url"
sender_parameter = None
def __init__(self, grafana_key,dashboardId=None, panelId=None, annotation_tags=None, grafana_no_verify_ssl=False, isRegion=True,
fail_silently=False, **kwargs):
super(GrafanaBackend, self).__init__(fail_silently=fail_silently)
self.grafana_key = grafana_key
self.dashboardId = dashboardId
self.panelId = panelId
self.annotation_tags = annotation_tags if annotation_tags is not None else []
self.grafana_no_verify_ssl = grafana_no_verify_ssl
self.isRegion = isRegion
def format_body(self, body):
return body
def send_messages(self, messages):
sent_messages = 0
for m in messages:
grafana_data = {}
grafana_headers = {}
try:
epoch=datetime.datetime.utcfromtimestamp(0)
grafana_data['time'] = int((dp.parse(m.body['started']).replace(tzinfo=None) - epoch).total_seconds() * 1000)
grafana_data['timeEnd'] = int((dp.parse(m.body['finished']).replace(tzinfo=None) - epoch).total_seconds() * 1000)
except ValueError:
logger.error(smart_text(_("Error converting time {} or timeEnd {} to int.").format(m.body['started'],m.body['finished'])))
if not self.fail_silently:
raise Exception(smart_text(_("Error converting time {} and/or timeEnd {} to int.").format(m.body['started'],m.body['finished'])))
grafana_data['isRegion'] = self.isRegion
grafana_data['dashboardId'] = self.dashboardId
grafana_data['panelId'] = self.panelId
grafana_data['tags'] = self.annotation_tags
grafana_data['text'] = m.subject
grafana_headers['Authorization'] = "Bearer {}".format(self.grafana_key)
grafana_headers['Content-Type'] = "application/json"
r = requests.post("{}/api/annotations".format(m.recipients()[0]),
json=grafana_data,
headers=grafana_headers,
verify=(not self.grafana_no_verify_ssl))
if r.status_code >= 400:
logger.error(smart_text(_("Error sending notification grafana: {}").format(r.text)))
if not self.fail_silently:
raise Exception(smart_text(_("Error sending notification grafana: {}").format(r.text)))
sent_messages += 1
return sent_messages
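
The Grafana annotations endpoint expects time/timeEnd as epoch milliseconds, which is what the x1000 conversion above produces from the job's started/finished strings via dateutil. The conversion on its own (the timestamp is a dummy value):

import datetime
import dateutil.parser as dp

epoch = datetime.datetime.utcfromtimestamp(0)
started = '2019-01-22T09:32:39Z'
ms = int((dp.parse(started).replace(tzinfo=None) - epoch).total_seconds() * 1000)
print(ms)   # 1548149559000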

View File

@@ -35,7 +35,7 @@ class MattermostBackend(AWXBaseEmailBackend):
for m in messages:
payload = {}
for opt, optval in {'mattermost_icon_url':'icon_url',
'mattermost_channel': 'channel', 'mattermost_username': 'username'}.iteritems():
'mattermost_channel': 'channel', 'mattermost_username': 'username'}.items():
optvalue = getattr(self, opt)
if optvalue is not None:
payload[optval] = optvalue.strip()

View File

@@ -33,7 +33,7 @@ class RocketChatBackend(AWXBaseEmailBackend):
for m in messages:
payload = {"text": m.subject}
for opt, optval in {'rocketchat_icon_url': 'icon_url',
'rocketchat_username': 'username'}.iteritems():
'rocketchat_username': 'username'}.items():
optvalue = getattr(self, opt)
if optvalue is not None:
payload[optval] = optvalue.strip()

View File

@@ -6,8 +6,6 @@ import json
import logging
import os
from six.moves import xrange
# Django
from django.conf import settings
@@ -51,7 +49,7 @@ class CallbackQueueDispatcher(object):
if not self.callback_connection or not self.connection_queue:
return
active_pid = os.getpid()
for retry_count in xrange(4):
for retry_count in range(4):
try:
if not hasattr(self, 'connection_pid'):
self.connection_pid = active_pid

View File

@@ -1,12 +1,12 @@
import re
import urlparse
import urllib.parse as urlparse
REPLACE_STR = '$encrypted$'
class UriCleaner(object):
REPLACE_STR = REPLACE_STR
SENSITIVE_URI_PATTERN = re.compile(ur'(\w+:(\/?\/?)[^\s]+)', re.MULTILINE) # NOQA
SENSITIVE_URI_PATTERN = re.compile(r'(\w+:(\/?\/?)[^\s]+)', re.MULTILINE) # NOQA
@staticmethod
def remove_sensitive(cleartext):
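On Python 3 the ur'' prefix is a syntax error and plain r'' patterns are already unicode, so the regex above compiles unchanged. Roughly how it picks out credential-bearing URLs for redaction (the URL below is made up):

import re

SENSITIVE_URI_PATTERN = re.compile(r'(\w+:(\/?\/?)[^\s]+)', re.MULTILINE)

match = SENSITIVE_URI_PATTERN.search('cloning https://user:secret@git.example.com/repo.git failed')
print(match.group(1))   # https://user:secret@git.example.com/repo.git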

View File

@@ -152,8 +152,8 @@ class SimpleDAG(object):
return self._get_dependencies_by_label(this_ord, label)
else:
nodes = []
map(lambda l: nodes.extend(self._get_dependencies_by_label(this_ord, l)),
self.node_from_edges_by_label.keys())
for l in self.node_from_edges_by_label.keys():
nodes.extend(self._get_dependencies_by_label(this_ord, l))
return nodes
def _get_dependents_by_label(self, node_index, label):
@@ -168,8 +168,8 @@ class SimpleDAG(object):
return self._get_dependents_by_label(this_ord, label)
else:
nodes = []
map(lambda l: nodes.extend(self._get_dependents_by_label(this_ord, l)),
self.node_to_edges_by_label.keys())
for l in self.node_to_edges_by_label.keys():
nodes.extend(self._get_dependents_by_label(this_ord, l))
return nodes
def get_root_nodes(self):
@@ -189,7 +189,7 @@ class SimpleDAG(object):
node_obj = stack.pop()
children = [node['node_object'] for node in self.get_dependencies(node_obj)]
children_to_add = filter(lambda node_obj: node_obj not in node_objs_visited, children)
children_to_add = list(filter(lambda node_obj: node_obj not in node_objs_visited, children))
if children_to_add:
if node_obj in path:
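The SimpleDAG rewrites above matter because map() is lazy on Python 3: calling it purely for side effects never runs the lambda. A small demonstration of why the explicit for loop is needed:

nodes = []
labels = ['success', 'failure']

map(lambda l: nodes.extend([l + '_node']), labels)   # iterator never consumed
print(nodes)                                         # []

for l in labels:
    nodes.extend([l + '_node'])
print(nodes)                                         # ['success_node', 'failure_node']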

View File

@@ -145,4 +145,5 @@ class DependencyGraph(object):
self.mark_inventory_update(job.inventory_id)
def add_jobs(self, jobs):
map(lambda j: self.add_job(j), jobs)
for j in jobs:
self.add_job(j)

View File

@@ -8,7 +8,6 @@ import uuid
import json
import six
import random
from sets import Set
# Django
from django.db import transaction, connection
@@ -77,14 +76,14 @@ class TaskManager():
def get_latest_project_update_tasks(self, all_sorted_tasks):
project_ids = Set()
project_ids = set()
for task in all_sorted_tasks:
if isinstance(task, Job):
project_ids.add(task.project_id)
return ProjectUpdate.objects.filter(id__in=project_ids)
def get_latest_inventory_update_tasks(self, all_sorted_tasks):
inventory_ids = Set()
inventory_ids = set()
for task in all_sorted_tasks:
if isinstance(task, Job):
inventory_ids.add(task.inventory_id)
@@ -96,7 +95,7 @@ class TaskManager():
return graph_workflow_jobs
def get_inventory_source_tasks(self, all_sorted_tasks):
inventory_ids = Set()
inventory_ids = set()
for task in all_sorted_tasks:
if isinstance(task, Job):
inventory_ids.add(task.inventory_id)
@@ -174,7 +173,8 @@ class TaskManager():
status_changed = True
else:
workflow_nodes = dag.mark_dnr_nodes()
map(lambda n: n.save(update_fields=['do_not_run']), workflow_nodes)
for n in workflow_nodes:
n.save(update_fields=['do_not_run'])
is_done = dag.is_workflow_done()
if not is_done:
continue
@@ -284,7 +284,9 @@ class TaskManager():
connection.on_commit(post_commit)
def process_running_tasks(self, running_tasks):
map(lambda task: self.graph[task.instance_group.name]['graph'].add_job(task) if task.instance_group else None, running_tasks)
for task in running_tasks:
if task.instance_group:
self.graph[task.instance_group.name]['graph'].add_job(task)
def create_project_update(self, task):
project_task = Project.objects.get(id=task.project_id).create_project_update(
@@ -323,7 +325,7 @@ class TaskManager():
for dep in dependencies:
# Add task + all deps except self
dep.dependent_jobs.add(*([task] + filter(lambda d: d != dep, dependencies)))
dep.dependent_jobs.add(*([task] + [d for d in dependencies if d != dep]))
def get_latest_inventory_update(self, inventory_source):
latest_inventory_update = InventoryUpdate.objects.filter(inventory_source=inventory_source).order_by("-created")
@@ -456,7 +458,7 @@ class TaskManager():
task.log_format, rampart_group.name, execution_instance.hostname))
if execution_instance:
self.graph[rampart_group.name]['graph'].add_job(task)
tasks_to_fail = filter(lambda t: t != task, dependency_tasks)
tasks_to_fail = [t for t in dependency_tasks if t != task]
tasks_to_fail += [dependent_task]
self.start_task(task, rampart_group, tasks_to_fail, execution_instance)
found_acceptable_queue = True
@@ -534,13 +536,13 @@ class TaskManager():
return (self.graph[instance_group]['capacity_total'] - self.graph[instance_group]['consumed_capacity'])
def process_tasks(self, all_sorted_tasks):
running_tasks = filter(lambda t: t.status in ['waiting', 'running'], all_sorted_tasks)
running_tasks = [t for t in all_sorted_tasks if t.status in ['waiting', 'running']]
self.calculate_capacity_consumed(running_tasks)
self.process_running_tasks(running_tasks)
pending_tasks = filter(lambda t: t.status in 'pending', all_sorted_tasks)
pending_tasks = [t for t in all_sorted_tasks if t.status == 'pending']
self.process_pending_tasks(pending_tasks)
def _schedule(self):
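Similarly, filter() returns a one-shot iterator on Python 3, with no len() and no reuse, so the task-manager call sites above become list comprehensions (the rewrite also tightens the accidental substring check "in 'pending'" to an equality check). For example:

all_tasks = ['waiting', 'running', 'pending', 'successful']

lazy = filter(lambda s: s in ('waiting', 'running'), all_tasks)
print(list(lazy))   # ['waiting', 'running']
print(list(lazy))   # [] -- already exhausted

running = [s for s in all_tasks if s in ('waiting', 'running')]
print(running)      # reusable list, same contents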

View File

@@ -20,9 +20,9 @@ from django.db.models.signals import (
)
from django.dispatch import receiver
from django.contrib.auth import SESSION_KEY
from django.contrib.contenttypes.models import ContentType
from django.contrib.sessions.models import Session
from django.utils import timezone
from django.utils.translation import ugettext_lazy as _
# Django-CRUM
from crum import get_current_request, get_current_user
@@ -32,7 +32,6 @@ import six
# AWX
from awx.main.models import * # noqa
from awx.api.serializers import * # noqa
from awx.main.constants import CENSOR_VALUE
from awx.main.utils import model_instance_diff, model_to_dict, camelcase_to_underscore, get_current_apps
from awx.main.utils import ignore_inventory_computed_fields, ignore_inventory_group_removal, _inventory_updates
@@ -80,23 +79,28 @@ def emit_event_detail(serializer, relation, **kwargs):
def emit_job_event_detail(sender, **kwargs):
emit_event_detail(JobEventWebSocketSerializer, 'job_id', **kwargs)
from awx.api import serializers
emit_event_detail(serializers.JobEventWebSocketSerializer, 'job_id', **kwargs)
def emit_ad_hoc_command_event_detail(sender, **kwargs):
emit_event_detail(AdHocCommandEventWebSocketSerializer, 'ad_hoc_command_id', **kwargs)
from awx.api import serializers
emit_event_detail(serializers.AdHocCommandEventWebSocketSerializer, 'ad_hoc_command_id', **kwargs)
def emit_project_update_event_detail(sender, **kwargs):
emit_event_detail(ProjectUpdateEventWebSocketSerializer, 'project_update_id', **kwargs)
from awx.api import serializers
emit_event_detail(serializers.ProjectUpdateEventWebSocketSerializer, 'project_update_id', **kwargs)
def emit_inventory_update_event_detail(sender, **kwargs):
emit_event_detail(InventoryUpdateEventWebSocketSerializer, 'inventory_update_id', **kwargs)
from awx.api import serializers
emit_event_detail(serializers.InventoryUpdateEventWebSocketSerializer, 'inventory_update_id', **kwargs)
def emit_system_job_event_detail(sender, **kwargs):
emit_event_detail(SystemJobEventWebSocketSerializer, 'system_job_id', **kwargs)
from awx.api import serializers
emit_event_detail(serializers.SystemJobEventWebSocketSerializer, 'system_job_id', **kwargs)
def emit_update_inventory_computed_fields(sender, **kwargs):
@@ -347,7 +351,7 @@ class ActivityStreamEnabled(threading.local):
def __init__(self):
self.enabled = True
def __nonzero__(self):
def __bool__(self):
return bool(self.enabled and getattr(settings, 'ACTIVITY_STREAM_ENABLED', True))
@@ -385,31 +389,38 @@ def disable_computed_fields():
connect_computed_field_signals()
model_serializer_mapping = {
Organization: OrganizationSerializer,
Inventory: InventorySerializer,
Host: HostSerializer,
Group: GroupSerializer,
InstanceGroup: InstanceGroupSerializer,
InventorySource: InventorySourceSerializer,
CustomInventoryScript: CustomInventoryScriptSerializer,
Credential: CredentialSerializer,
Team: TeamSerializer,
Project: ProjectSerializer,
JobTemplate: JobTemplateWithSpecSerializer,
Job: JobSerializer,
AdHocCommand: AdHocCommandSerializer,
NotificationTemplate: NotificationTemplateSerializer,
Notification: NotificationSerializer,
CredentialType: CredentialTypeSerializer,
Schedule: ScheduleSerializer,
Label: LabelSerializer,
WorkflowJobTemplate: WorkflowJobTemplateWithSpecSerializer,
WorkflowJobTemplateNode: WorkflowJobTemplateNodeSerializer,
WorkflowJob: WorkflowJobSerializer,
OAuth2AccessToken: OAuth2TokenSerializer,
OAuth2Application: OAuth2ApplicationSerializer,
}
def model_serializer_mapping():
from awx.api import serializers
from awx.main import models
from awx.conf.models import Setting
from awx.conf.serializers import SettingSerializer
return {
Setting: SettingSerializer,
models.Organization: serializers.OrganizationSerializer,
models.Inventory: serializers.InventorySerializer,
models.Host: serializers.HostSerializer,
models.Group: serializers.GroupSerializer,
models.InstanceGroup: serializers.InstanceGroupSerializer,
models.InventorySource: serializers.InventorySourceSerializer,
models.CustomInventoryScript: serializers.CustomInventoryScriptSerializer,
models.Credential: serializers.CredentialSerializer,
models.Team: serializers.TeamSerializer,
models.Project: serializers.ProjectSerializer,
models.JobTemplate: serializers.JobTemplateWithSpecSerializer,
models.Job: serializers.JobSerializer,
models.AdHocCommand: serializers.AdHocCommandSerializer,
models.NotificationTemplate: serializers.NotificationTemplateSerializer,
models.Notification: serializers.NotificationSerializer,
models.CredentialType: serializers.CredentialTypeSerializer,
models.Schedule: serializers.ScheduleSerializer,
models.Label: serializers.LabelSerializer,
models.WorkflowJobTemplate: serializers.WorkflowJobTemplateWithSpecSerializer,
models.WorkflowJobTemplateNode: serializers.WorkflowJobTemplateNodeSerializer,
models.WorkflowJob: serializers.WorkflowJobSerializer,
models.OAuth2AccessToken: serializers.OAuth2TokenSerializer,
models.OAuth2Application: serializers.OAuth2ApplicationSerializer,
}
def activity_stream_create(sender, instance, created, **kwargs):
@@ -422,7 +433,7 @@ def activity_stream_create(sender, instance, created, **kwargs):
if getattr(_type, '_deferred', False):
return
object1 = camelcase_to_underscore(instance.__class__.__name__)
changes = model_to_dict(instance, model_serializer_mapping)
changes = model_to_dict(instance, model_serializer_mapping())
# Special case where Job survey password variables need to be hidden
if type(instance) == Job:
changes['credentials'] = [
@@ -461,7 +472,7 @@ def activity_stream_update(sender, instance, **kwargs):
return
new = instance
changes = model_instance_diff(old, new, model_serializer_mapping)
changes = model_instance_diff(old, new, model_serializer_mapping())
if changes is None:
return
_type = type(instance)
@@ -506,7 +517,7 @@ def activity_stream_delete(sender, instance, **kwargs):
_type = type(instance)
if getattr(_type, '_deferred', False):
return
changes.update(model_to_dict(instance, model_serializer_mapping))
changes.update(model_to_dict(instance, model_serializer_mapping()))
object1 = camelcase_to_underscore(instance.__class__.__name__)
if type(instance) == OAuth2AccessToken:
changes['token'] = CENSOR_VALUE
@@ -643,7 +654,7 @@ def save_user_session_membership(sender, **kwargs):
if len(expired):
consumers.emit_channel_notification(
'control-limit_reached_{}'.format(user.pk),
dict(group_name='control', reason=unicode(_('limit_reached')))
dict(group_name='control', reason='limit_reached')
)
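The signals.py changes above turn the serializer mapping into a function and import awx.api.serializers inside it, which is the usual way to break a circular import. A runnable toy version of the same pattern, with json standing in for the real serializer module:

def model_serializer_mapping():
    import json                      # deferred until first call, not at module import time
    return {dict: json.dumps}

mapping = model_serializer_mapping()
print(mapping[dict]({'name': 'Default'}))   # {"name": "Default"}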

View File

@@ -3,9 +3,9 @@
# Python
from collections import OrderedDict, namedtuple
import ConfigParser
import cStringIO
import configparser
import errno
import fnmatch
import functools
import importlib
import json
@@ -18,7 +18,6 @@ import stat
import tempfile
import time
import traceback
import urlparse
from distutils.version import LooseVersion as Version
import yaml
import fcntl
@@ -26,6 +25,8 @@ try:
import psutil
except Exception:
psutil = None
from io import StringIO
import urllib.parse as urlparse
# Django
from django.conf import settings
@@ -53,7 +54,7 @@ from awx.main.queue import CallbackQueueDispatcher
from awx.main.expect import run, isolated_manager
from awx.main.dispatch.publish import task
from awx.main.dispatch import get_local_queuename, reaper
from awx.main.utils import (get_ansible_version, get_ssh_version, decrypt_field, update_scm_url,
from awx.main.utils import (get_ansible_version, get_ssh_version, update_scm_url,
check_proot_installed, build_proot_temp_dir, get_licenser,
wrap_args_with_proot, OutputEventFilter, OutputVerboseFilter, ignore_inventory_computed_fields,
ignore_inventory_group_removal, extract_ansible_vars, schedule_task_manager)
@@ -185,9 +186,9 @@ def apply_cluster_membership_policies():
actual_instances = [Node(obj=i, groups=[]) for i in considered_instances if i.managed_by_policy]
logger.info("Total non-isolated instances:{} available for policy: {}".format(
total_instances, len(actual_instances)))
for g in sorted(actual_groups, cmp=lambda x,y: len(x.instances) - len(y.instances)):
for g in sorted(actual_groups, key=lambda x: len(x.instances)):
policy_min_added = []
for i in sorted(actual_instances, cmp=lambda x,y: len(x.groups) - len(y.groups)):
for i in sorted(actual_instances, key=lambda x: len(x.groups)):
if len(g.instances) >= g.obj.policy_instance_minimum:
break
if i.obj.id in g.instances:
@@ -201,9 +202,9 @@ def apply_cluster_membership_policies():
logger.info(six.text_type("Policy minimum, adding Instances {} to Group {}").format(policy_min_added, g.obj.name))
# Finally, process instance policy percentages
for g in sorted(actual_groups, cmp=lambda x,y: len(x.instances) - len(y.instances)):
for g in sorted(actual_groups, key=lambda x: len(x.instances)):
policy_per_added = []
for i in sorted(actual_instances, cmp=lambda x,y: len(x.groups) - len(y.groups)):
for i in sorted(actual_instances, key=lambda x: len(x.groups)):
if i.obj.id in g.instances:
# If the instance is already _in_ the group, it was
# applied earlier via a minimum policy or policy list
@@ -294,7 +295,7 @@ def send_notifications(notification_list, job_id=None):
finally:
try:
notification.save(update_fields=update_fields)
except Exception as e:
except Exception:
logger.exception(six.text_type('Error saving notification {} result.').format(notification.id))
@@ -688,6 +689,13 @@ class BaseTask(object):
'''
return os.path.abspath(os.path.join(os.path.dirname(__file__), *args))
def get_path_to_ansible(self, instance, executable='ansible-playbook', **kwargs):
venv_path = getattr(instance, 'ansible_virtualenv_path', settings.ANSIBLE_VENV_PATH)
venv_exe = os.path.join(venv_path, 'bin', executable)
if os.path.exists(venv_exe):
return venv_exe
return shutil.which(executable)
def build_private_data(self, job, **kwargs):
'''
Return SSH private key data (only if stored in DB as ssh_key_data).
@@ -722,12 +730,12 @@ class BaseTask(object):
ssh_ver = get_ssh_version()
ssh_too_old = True if ssh_ver == "unknown" else Version(ssh_ver) < Version("6.0")
openssh_keys_supported = ssh_ver != "unknown" and Version(ssh_ver) >= Version("6.5")
for credential, data in private_data.get('credentials', {}).iteritems():
for credential, data in private_data.get('credentials', {}).items():
# Bail out now if a private key was provided in OpenSSH format
# and we're running an earlier version (<6.5).
if 'OPENSSH PRIVATE KEY' in data and not openssh_keys_supported:
raise RuntimeError(OPENSSH_KEY_ERROR)
for credential, data in private_data.get('credentials', {}).iteritems():
for credential, data in private_data.get('credentials', {}).items():
# OpenSSH formatted keys must have a trailing newline to be
# accepted by ssh-add.
if 'OPENSSH PRIVATE KEY' in data and not data.endswith('\n'):
@@ -782,8 +790,11 @@ class BaseTask(object):
'a valid Python virtualenv does not exist at {}'.format(venv_path)
)
env.pop('PYTHONPATH', None) # default to none if no python_ver matches
if os.path.isdir(os.path.join(venv_libdir, "python2.7")):
env['PYTHONPATH'] = os.path.join(venv_libdir, "python2.7", "site-packages") + ":"
for version in os.listdir(venv_libdir):
if fnmatch.fnmatch(version, 'python[23].*'):
if os.path.isdir(os.path.join(venv_libdir, version)):
env['PYTHONPATH'] = os.path.join(venv_libdir, version, "site-packages") + ":"
break
# Add awx/lib to PYTHONPATH.
if add_awx_lib:
env['PYTHONPATH'] = env.get('PYTHONPATH', '') + self.get_path_to('..', 'lib') + ':'
@@ -831,7 +842,7 @@ class BaseTask(object):
json_data = json.dumps(script_data)
handle, path = tempfile.mkstemp(dir=kwargs.get('private_data_dir', None))
f = os.fdopen(handle, 'w')
f.write('#! /usr/bin/env python\n# -*- coding: utf-8 -*-\nprint %r\n' % json_data)
f.write('#! /usr/bin/env python\n# -*- coding: utf-8 -*-\nprint(%r)\n' % json_data)
f.close()
os.chmod(path, stat.S_IRUSR | stat.S_IXUSR | stat.S_IWUSR)
return path
@@ -882,7 +893,7 @@ class BaseTask(object):
if 'uuid' in event_data:
cache_event = cache.get('ev-{}'.format(event_data['uuid']), None)
if cache_event is not None:
event_data.update(cache_event)
event_data.update(json.loads(cache_event))
dispatcher.dispatch(event_data)
return OutputEventFilter(event_callback)
@@ -1113,16 +1124,16 @@ class RunJob(BaseTask):
for credential in job.credentials.all():
# If we were sent SSH credentials, decrypt them and send them
# back (they will be written to a temporary file).
if credential.ssh_key_data not in (None, ''):
private_data['credentials'][credential] = decrypt_field(credential, 'ssh_key_data') or ''
if credential.has_input('ssh_key_data'):
private_data['credentials'][credential] = credential.get_input('ssh_key_data', default='')
if credential.kind == 'openstack':
openstack_auth = dict(auth_url=credential.host,
username=credential.username,
password=decrypt_field(credential, "password"),
project_name=credential.project)
if credential.domain not in (None, ''):
openstack_auth['domain_name'] = credential.domain
openstack_auth = dict(auth_url=credential.get_input('host', default=''),
username=credential.get_input('username', default=''),
password=credential.get_input('password', default=''),
project_name=credential.get_input('project', default=''))
if credential.has_input('domain'):
openstack_auth['domain_name'] = credential.get_input('domain', default='')
openstack_data = {
'clouds': {
'devstack': {
@@ -1145,22 +1156,27 @@ class RunJob(BaseTask):
for field in ('ssh_key_unlock', 'ssh_password', 'become_password'):
value = kwargs.get(
field,
decrypt_field(cred, 'password' if field == 'ssh_password' else field)
cred.get_input('password' if field == 'ssh_password' else field, default='')
)
if value not in ('', 'ASK'):
passwords[field] = value
for cred in job.vault_credentials:
field = 'vault_password'
if cred.inputs.get('vault_id'):
field = 'vault_password.{}'.format(cred.inputs['vault_id'])
vault_id = cred.get_input('vault_id', default=None)
if vault_id:
field = 'vault_password.{}'.format(vault_id)
if field in passwords:
raise RuntimeError(
'multiple vault credentials were specified with --vault-id {}@prompt'.format(
cred.inputs['vault_id']
vault_id
)
)
value = kwargs.get(field, decrypt_field(cred, 'vault_password'))
value = kwargs.get(field, None)
if value is None:
value = cred.get_input('vault_password', default='')
if value not in ('', 'ASK'):
passwords[field] = value
@@ -1170,10 +1186,10 @@ class RunJob(BaseTask):
'''
if 'ssh_key_unlock' not in passwords:
for cred in job.network_credentials:
if cred.inputs.get('ssh_key_unlock'):
if cred.has_input('ssh_key_unlock'):
passwords['ssh_key_unlock'] = kwargs.get(
'ssh_key_unlock',
decrypt_field(cred, 'ssh_key_unlock')
cred.get_input('ssh_key_unlock', default='')
)
break
@@ -1229,17 +1245,17 @@ class RunJob(BaseTask):
env['OS_CLIENT_CONFIG_FILE'] = cred_files.get(cloud_cred, '')
for network_cred in job.network_credentials:
env['ANSIBLE_NET_USERNAME'] = network_cred.username
env['ANSIBLE_NET_PASSWORD'] = decrypt_field(network_cred, 'password')
env['ANSIBLE_NET_USERNAME'] = network_cred.get_input('username', default='')
env['ANSIBLE_NET_PASSWORD'] = network_cred.get_input('password', default='')
ssh_keyfile = cred_files.get(network_cred, '')
if ssh_keyfile:
env['ANSIBLE_NET_SSH_KEYFILE'] = ssh_keyfile
authorize = network_cred.authorize
authorize = network_cred.get_input('authorize', default=False)
env['ANSIBLE_NET_AUTHORIZE'] = six.text_type(int(authorize))
if authorize:
env['ANSIBLE_NET_AUTH_PASS'] = decrypt_field(network_cred, 'authorize_password')
env['ANSIBLE_NET_AUTH_PASS'] = network_cred.get_input('authorize_password', default='')
return env
@@ -1252,9 +1268,9 @@ class RunJob(BaseTask):
ssh_username, become_username, become_method = '', '', ''
if creds:
ssh_username = kwargs.get('username', creds.username)
become_method = kwargs.get('become_method', creds.become_method)
become_username = kwargs.get('become_username', creds.become_username)
ssh_username = kwargs.get('username', creds.get_input('username', default=''))
become_method = kwargs.get('become_method', creds.get_input('become_method', default=''))
become_username = kwargs.get('become_username', creds.get_input('become_username', default=''))
else:
become_method = None
become_username = ""
@@ -1263,7 +1279,11 @@ class RunJob(BaseTask):
# it doesn't make sense to rely on ansible-playbook's default of using
# the current user.
ssh_username = ssh_username or 'root'
args = ['ansible-playbook', '-i', self.build_inventory(job, **kwargs)]
args = [
self.get_path_to_ansible(job, 'ansible-playbook', **kwargs),
'-i',
self.build_inventory(job, **kwargs)
]
if job.job_type == 'check':
args.append('--check')
args.extend(['-u', sanitize_jinja(ssh_username)])
@@ -1475,8 +1495,8 @@ class RunProjectUpdate(BaseTask):
private_data = {'credentials': {}}
if project_update.credential:
credential = project_update.credential
if credential.ssh_key_data not in (None, ''):
private_data['credentials'][credential] = decrypt_field(credential, 'ssh_key_data')
if credential.has_input('ssh_key_data'):
private_data['credentials'][credential] = credential.get_input('ssh_key_data', default='')
return private_data
def build_passwords(self, project_update, **kwargs):
@@ -1487,9 +1507,9 @@ class RunProjectUpdate(BaseTask):
passwords = super(RunProjectUpdate, self).build_passwords(project_update,
**kwargs)
if project_update.credential:
passwords['scm_key_unlock'] = decrypt_field(project_update.credential, 'ssh_key_unlock')
passwords['scm_username'] = project_update.credential.username
passwords['scm_password'] = decrypt_field(project_update.credential, 'password')
passwords['scm_key_unlock'] = project_update.credential.get_input('ssh_key_unlock', default='')
passwords['scm_username'] = project_update.credential.get_input('username', default='')
passwords['scm_password'] = project_update.credential.get_input('password', default='')
return passwords
def build_env(self, project_update, **kwargs):
@@ -1557,7 +1577,11 @@ class RunProjectUpdate(BaseTask):
Build command line argument list for running ansible-playbook,
optionally using ssh-agent for public/private key authentication.
'''
args = ['ansible-playbook', '-i', self.build_inventory(project_update, **kwargs)]
args = [
self.get_path_to_ansible(project_update, 'ansible-playbook', **kwargs),
'-i',
self.build_inventory(project_update, **kwargs)
]
if getattr(settings, 'PROJECT_UPDATE_VVV', False):
args.append('-vvv')
else:
@@ -1588,7 +1612,7 @@ class RunProjectUpdate(BaseTask):
def build_safe_args(self, project_update, **kwargs):
pwdict = dict(kwargs.get('passwords', {}).items())
for pw_name, pw_val in pwdict.items():
for pw_name, pw_val in list(pwdict.items()):
if pw_name in ('', 'yes', 'no', 'scm_username'):
continue
pwdict[pw_name] = HIDDEN_PASSWORD
@@ -1609,7 +1633,7 @@ class RunProjectUpdate(BaseTask):
scm_username = kwargs.get('passwords', {}).get('scm_username', '')
scm_password = kwargs.get('passwords', {}).get('scm_password', '')
pwdict = dict(kwargs.get('passwords', {}).items())
for pw_name, pw_val in pwdict.items():
for pw_name, pw_val in list(pwdict.items()):
if pw_name in ('', 'yes', 'no', 'scm_username'):
continue
pwdict[pw_name] = HIDDEN_PASSWORD
@@ -1809,12 +1833,13 @@ class RunInventoryUpdate(BaseTask):
credential = inventory_update.get_cloud_credential()
if inventory_update.source == 'openstack':
openstack_auth = dict(auth_url=credential.host,
username=credential.username,
password=decrypt_field(credential, "password"),
project_name=credential.project)
if credential.domain not in (None, ''):
openstack_auth['domain_name'] = credential.domain
openstack_auth = dict(auth_url=credential.get_input('host', default=''),
username=credential.get_input('username', default=''),
password=credential.get_input('password', default=''),
project_name=credential.get_input('project', default=''))
if credential.has_input('domain'):
openstack_auth['domain_name'] = credential.get_input('domain', default='')
private_state = inventory_update.source_vars_dict.get('private', True)
# Retrieve cache path from inventory update vars if available,
# otherwise create a temporary cache path only for this update.
@@ -1850,7 +1875,7 @@ class RunInventoryUpdate(BaseTask):
)
return private_data
cp = ConfigParser.ConfigParser()
cp = configparser.RawConfigParser()
# Build custom ec2.ini for ec2 inventory script to use.
if inventory_update.source == 'ec2':
section = 'ec2'
@@ -1881,18 +1906,18 @@ class RunInventoryUpdate(BaseTask):
cache_path = tempfile.mkdtemp(prefix='ec2_cache', dir=kwargs.get('private_data_dir', None))
ec2_opts['cache_path'] = cache_path
ec2_opts.setdefault('cache_max_age', '300')
for k,v in ec2_opts.items():
for k, v in ec2_opts.items():
cp.set(section, k, six.text_type(v))
# Allow custom options to vmware inventory script.
elif inventory_update.source == 'vmware':
section = 'vmware'
cp.add_section(section)
cp.set('vmware', 'cache_max_age', 0)
cp.set('vmware', 'cache_max_age', '0')
cp.set('vmware', 'validate_certs', str(settings.VMWARE_VALIDATE_CERTS))
cp.set('vmware', 'username', credential.username)
cp.set('vmware', 'password', decrypt_field(credential, 'password'))
cp.set('vmware', 'server', credential.host)
cp.set('vmware', 'username', credential.get_input('username', default=''))
cp.set('vmware', 'password', credential.get_input('password', default=''))
cp.set('vmware', 'server', credential.get_input('host', default=''))
vmware_opts = dict(inventory_update.source_vars_dict.items())
if inventory_update.instance_filters:
@@ -1900,7 +1925,7 @@ class RunInventoryUpdate(BaseTask):
if inventory_update.group_by:
vmware_opts.setdefault('groupby_patterns', inventory_update.group_by)
for k,v in vmware_opts.items():
for k, v in vmware_opts.items():
cp.set(section, k, six.text_type(v))
elif inventory_update.source == 'satellite6':
@@ -1913,9 +1938,9 @@ class RunInventoryUpdate(BaseTask):
foreman_opts = dict(inventory_update.source_vars_dict.items())
foreman_opts.setdefault('ssl_verify', 'False')
for k, v in foreman_opts.items():
if k == 'satellite6_group_patterns' and isinstance(v, basestring):
if k == 'satellite6_group_patterns' and isinstance(v, str):
group_patterns = v
elif k == 'satellite6_group_prefix' and isinstance(v, basestring):
elif k == 'satellite6_group_prefix' and isinstance(v, str):
group_prefix = v
elif k == 'satellite6_want_hostcollections' and isinstance(v, bool):
want_hostcollections = v
@@ -1923,15 +1948,15 @@ class RunInventoryUpdate(BaseTask):
cp.set(section, k, six.text_type(v))
if credential:
cp.set(section, 'url', credential.host)
cp.set(section, 'user', credential.username)
cp.set(section, 'password', decrypt_field(credential, 'password'))
cp.set(section, 'url', credential.get_input('host', default=''))
cp.set(section, 'user', credential.get_input('username', default=''))
cp.set(section, 'password', credential.get_input('password', default=''))
section = 'ansible'
cp.add_section(section)
cp.set(section, 'group_patterns', group_patterns)
cp.set(section, 'want_facts', True)
cp.set(section, 'want_hostcollections', want_hostcollections)
cp.set(section, 'want_facts', 'True')
cp.set(section, 'want_hostcollections', str(want_hostcollections))
cp.set(section, 'group_prefix', group_prefix)
section = 'cache'
@@ -1944,15 +1969,15 @@ class RunInventoryUpdate(BaseTask):
cp.add_section(section)
if credential:
cp.set(section, 'url', credential.host)
cp.set(section, 'username', credential.username)
cp.set(section, 'password', decrypt_field(credential, 'password'))
cp.set(section, 'url', credential.get_input('host', default=''))
cp.set(section, 'username', credential.get_input('username', default=''))
cp.set(section, 'password', credential.get_input('password', default=''))
cp.set(section, 'ssl_verify', "false")
cloudforms_opts = dict(inventory_update.source_vars_dict.items())
for opt in ['version', 'purge_actions', 'clean_group_keys', 'nest_tags', 'suffix', 'prefer_ipv4']:
if opt in cloudforms_opts:
cp.set(section, opt, cloudforms_opts[opt])
cp.set(section, opt, str(cloudforms_opts[opt]))
section = 'cache'
cp.add_section(section)
@@ -1978,12 +2003,12 @@ class RunInventoryUpdate(BaseTask):
)
azure_rm_opts = dict(inventory_update.source_vars_dict.items())
for k,v in azure_rm_opts.items():
for k, v in azure_rm_opts.items():
cp.set(section, k, six.text_type(v))
# Return INI content.
if cp.sections():
f = cStringIO.StringIO()
f = StringIO()
cp.write(f)
private_data['credentials'][credential] = f.getvalue()
return private_data
@@ -2002,10 +2027,10 @@ class RunInventoryUpdate(BaseTask):
credential = inventory_update.get_cloud_credential()
if credential:
for subkey in ('username', 'host', 'project', 'client', 'tenant', 'subscription'):
passwords['source_%s' % subkey] = getattr(credential, subkey)
passwords['source_%s' % subkey] = credential.get_input(subkey, default='')
for passkey in ('password', 'ssh_key_data', 'security_token', 'secret'):
k = 'source_%s' % passkey
passwords[k] = decrypt_field(credential, passkey)
passwords[k] = credential.get_input(passkey, default='')
return passwords
def build_env(self, inventory_update, **kwargs):
@@ -2054,7 +2079,7 @@ class RunInventoryUpdate(BaseTask):
# by default, the GCE inventory source caches results on disk for
# 5 minutes; disable this behavior
cp = ConfigParser.ConfigParser()
cp = configparser.ConfigParser()
cp.add_section('cache')
cp.set('cache', 'cache_max_age', '0')
handle, path = tempfile.mkstemp(dir=kwargs.get('private_data_dir', None))
@@ -2095,8 +2120,12 @@ class RunInventoryUpdate(BaseTask):
args.append('--overwrite')
if inventory_update.overwrite_vars:
args.append('--overwrite-vars')
src = inventory_update.source
# Declare the virtualenv the management command should activate
# as it calls ansible-inventory
args.extend(['--venv', inventory_update.ansible_virtualenv_path])
src = inventory_update.source
# Add several options to the shell arguments based on the
# inventory-source-specific setting in the AWX configuration.
# These settings are "per-source"; it's entirely possible that
@@ -2134,7 +2163,7 @@ class RunInventoryUpdate(BaseTask):
f = os.fdopen(handle, 'w')
if inventory_update.source_script is None:
raise RuntimeError('Inventory Script does not exist')
f.write(inventory_update.source_script.script.encode('utf-8'))
f.write(inventory_update.source_script.script)
f.close()
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR | stat.S_IXUSR)
args.append(path)
@@ -2210,8 +2239,8 @@ class RunAdHocCommand(BaseTask):
# back (they will be written to a temporary file).
creds = ad_hoc_command.credential
private_data = {'credentials': {}}
if creds and creds.ssh_key_data not in (None, ''):
private_data['credentials'][creds] = decrypt_field(creds, 'ssh_key_data') or ''
if creds and creds.has_input('ssh_key_data'):
private_data['credentials'][creds] = creds.get_input('ssh_key_data', default='')
return private_data
def build_passwords(self, ad_hoc_command, **kwargs):
@@ -2224,9 +2253,9 @@ class RunAdHocCommand(BaseTask):
if creds:
for field in ('ssh_key_unlock', 'ssh_password', 'become_password'):
if field == 'ssh_password':
value = kwargs.get(field, decrypt_field(creds, 'password'))
value = kwargs.get(field, creds.get_input('password', default=''))
else:
value = kwargs.get(field, decrypt_field(creds, field))
value = kwargs.get(field, creds.get_input(field, default=''))
if value not in ('', 'ASK'):
passwords[field] = value
return passwords
@@ -2263,9 +2292,9 @@ class RunAdHocCommand(BaseTask):
creds = ad_hoc_command.credential
ssh_username, become_username, become_method = '', '', ''
if creds:
ssh_username = kwargs.get('username', creds.username)
become_method = kwargs.get('become_method', creds.become_method)
become_username = kwargs.get('become_username', creds.become_username)
ssh_username = kwargs.get('username', creds.get_input('username', default=''))
become_method = kwargs.get('become_method', creds.get_input('become_method', default=''))
become_username = kwargs.get('become_username', creds.get_input('become_username', default=''))
else:
become_method = None
become_username = ""
@@ -2274,7 +2303,11 @@ class RunAdHocCommand(BaseTask):
# it doesn't make sense to rely on ansible's default of using the
# current user.
ssh_username = ssh_username or 'root'
args = ['ansible', '-i', self.build_inventory(ad_hoc_command, **kwargs)]
args = [
self.get_path_to_ansible(ad_hoc_command, 'ansible', **kwargs),
'-i',
self.build_inventory(ad_hoc_command, **kwargs)
]
if ad_hoc_command.job_type == 'check':
args.append('--check')
args.extend(['-u', sanitize_jinja(ssh_username)])
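Among the tasks.py changes above, the new get_path_to_ansible() prefers an executable inside the job's custom virtualenv and falls back to the system PATH. A standalone sketch of the same lookup; the venv path is a placeholder:

import os
import shutil

def path_to_executable(venv_path, executable='ansible-playbook'):
    # Prefer the copy installed inside the custom virtualenv, if any.
    venv_exe = os.path.join(venv_path, 'bin', executable)
    if os.path.exists(venv_exe):
        return venv_exe
    # Otherwise fall back to whatever is first on PATH (may be None).
    return shutil.which(executable)

print(path_to_executable('/var/lib/awx/venv/ansible'))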

View File

@@ -1,5 +1,4 @@
import re
from django.utils.encoding import force_unicode
from django import template
register = template.Library()
@@ -12,7 +11,6 @@ VOWEL_SOUND = re.compile(r'''[aeio]|u([aeiou]|[^n][^aeiou]|ni[^dmnl]|nil[^l])|h(
def anora(text):
# https://pypi.python.org/pypi/anora
# < 10 lines of BSD-3 code, not worth a dependency
text = force_unicode(text)
anora = 'an' if not CONSONANT_SOUND.match(text) and VOWEL_SOUND.match(text) else 'a'
return anora + ' ' + text

View File

@@ -1,7 +1,7 @@
# Python
import pytest
import mock
from unittest import mock
from contextlib import contextmanager
from awx.main.tests.factories import (

View File

@@ -7,7 +7,6 @@ from django.core.serializers.json import DjangoJSONEncoder
from django.utils.functional import Promise
from django.utils.encoding import force_text
from coreapi.compat import force_bytes
from openapi_codec.encode import generate_swagger_object
import pytest
@@ -18,6 +17,8 @@ class i18nEncoder(DjangoJSONEncoder):
def default(self, obj):
if isinstance(obj, Promise):
return force_text(obj)
if type(obj) == bytes:
return force_text(obj)
return super(i18nEncoder, self).default(obj)
@@ -91,16 +92,16 @@ class TestSwaggerGeneration():
# for a reasonable number here; if this test starts failing, raise/lower the bounds
paths = JSON['paths']
assert 250 < len(paths) < 300
assert paths['/api/'].keys() == ['get']
assert paths['/api/v2/'].keys() == ['get']
assert sorted(
assert list(paths['/api/'].keys()) == ['get']
assert list(paths['/api/v2/'].keys()) == ['get']
assert list(sorted(
paths['/api/v2/credentials/'].keys()
) == ['get', 'post']
assert sorted(
)) == ['get', 'post']
assert list(sorted(
paths['/api/v2/credentials/{id}/'].keys()
) == ['delete', 'get', 'patch', 'put']
assert paths['/api/v2/settings/'].keys() == ['get']
assert paths['/api/v2/settings/{category_slug}/'].keys() == [
)) == ['delete', 'get', 'patch', 'put']
assert list(paths['/api/v2/settings/'].keys()) == ['get']
assert list(paths['/api/v2/settings/{category_slug}/'].keys()) == [
'get', 'put', 'patch', 'delete'
]
@@ -162,9 +163,7 @@ class TestSwaggerGeneration():
@classmethod
def teardown_class(cls):
with open('swagger.json', 'w') as f:
data = force_bytes(
json.dumps(cls.JSON, cls=i18nEncoder, indent=2)
)
data = json.dumps(cls.JSON, cls=i18nEncoder, indent=2, sort_keys=True)
# replace ISO dates w/ the same value so we don't generate
# needless diffs
data = re.sub(
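The swagger-test edits above exist because dict.keys() returns a view object on Python 3 and never compares equal to a list; wrapping it in list() (or comparing the output of sorted(), which is already a list) restores the old assertions.

paths = {'get': {}, 'post': {}}
print(paths.keys() == ['get', 'post'])           # False: a view is not a list
print(list(paths.keys()) == ['get', 'post'])     # True (insertion order preserved)
print(sorted(paths.keys()) == ['get', 'post'])   # True: sorted() returns a plain list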

View File

@@ -26,7 +26,7 @@ def generate_role_objects(objects):
combined_objects = {}
for o in objects:
if type(o) is dict:
for k,v in o.iteritems():
for k, v in o.items():
if combined_objects.get(k) is not None:
raise NotUnique(k, combined_objects)
combined_objects[k] = v

View File

@@ -1,4 +1,4 @@
import mock
from unittest import mock
import pytest
from awx.api.versioning import reverse

View File

@@ -1,4 +1,4 @@
import mock # noqa
from unittest import mock # noqa
import pytest
from awx.api.versioning import reverse
@@ -38,7 +38,7 @@ def post_adhoc(post, inventory, machine_credential):
if 'credential' not in data:
data['credential'] = machine_credential.id
for k,v in data.items():
for k, v in list(data.items()):
if v is None:
del data[k]
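The list() wrapper above is needed because Python 3 raises RuntimeError when a dict changes size while being iterated; snapshotting the items first makes the delete safe.

data = {'module_name': 'ping', 'limit': None, 'credential': 7}

for k, v in list(data.items()):   # iterate over a copy, mutate the original
    if v is None:
        del data[k]

print(data)   # {'module_name': 'ping', 'credential': 7}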

View File

@@ -1,9 +1,11 @@
import itertools
import re
import mock # noqa
from unittest import mock # noqa
import pytest
from django.utils.encoding import smart_str
from awx.main.models import (AdHocCommand, Credential, CredentialType, Job, JobTemplate,
Inventory, InventorySource, Project,
WorkflowJobNode)
@@ -255,7 +257,7 @@ def test_credential_validation_error_with_bad_user(post, admin, version, credent
admin
)
assert response.status_code == 400
assert response.data['user'][0] == 'Incorrect type. Expected pk value, received unicode.'
assert response.data['user'][0] == 'Incorrect type. Expected pk value, received str.'
@pytest.mark.django_db
@@ -799,7 +801,7 @@ def test_field_dependencies(get, post, organization, admin, kind, extraneous):
admin
)
assert response.status_code == 400
assert re.search('cannot be set unless .+ is set.', response.content)
assert re.search('cannot be set unless .+ is set.', smart_str(response.content))
assert Credential.objects.count() == 0

Some files were not shown because too many files have changed in this diff.