Compare commits

...

235 Commits

Author SHA1 Message Date
Hao Liu
f0c967c1b2 Merge pull request #13645 from TheRealHaoLiu/fix-websocket
Revert "Remove trailing $ from websocket_urlpatterns to work with cus…
2023-03-02 21:36:21 -05:00
Hao Liu
2ca0b7bc01 Revert "Remove trailing $ from websocket_urlpatterns to work with custom path to fix #12241"
This reverts commit 5e28f5dca1.
2023-03-02 21:14:53 -05:00
Alex Corey
1411d11a0e Merge pull request #13506 from AlexSCorey/13422-JTTabOnCreds
Conditionally applies the job templates tab to credentials that can be on a JT
2023-03-02 13:15:48 -05:00
Alex Corey
2fe1ea94bd Conditionally applies the job templates tab to credentials that can be on a JT 2023-03-02 12:57:20 -05:00
Hao Liu
a47cfc55ab Merge pull request #13574 from tomsiewert/use-compose-plugin
Make docker-compose command configurable in Makefile
2023-03-01 15:41:33 -05:00
Hao Liu
0eb9de02f3 Merge pull request #13627 from infamousjoeg/fix-13597-webservice_id-default
Fixes #13597 webservice_id default value added
2023-03-01 15:29:53 -05:00
Lila Yasin
39ee4285ce Working on running spellcheck on everything ahead of merging the shellcheck/code check CI addition. (#13453) 2023-03-01 10:19:00 -03:00
Christian Adams
2dcda04a9e Merge pull request #13445 from stanislav-zaprudskiy/disable_instance_command
Add `disable_instance` management command
2023-02-28 15:37:38 -05:00
Christian Adams
52d46c88e4 External users should not be able to change their password (#13491)
* Azure AD users should not be able to change their password

* Multiple auth changes

Moving get_external_user function into awx.sso.common
Altering get_external_user to not look at current config, just user object values
Altering how api/conf.py detects external auth config (and making reusable function in awx.sso.common)
Altering logic in api.serializers in _update_password to use awx.sso.common

* Adding unit tests

---------

Co-authored-by: John Westcott IV <john.westcott.iv@redhat.com>
2023-02-28 15:44:34 -03:00
Hao Liu
c2df22e0f0 Merge pull request #13632 from TheRealHaoLiu/reshaving-the-yak
[chore] update project_update playbook to be compliant with ansible-lint
2023-02-28 13:17:45 -05:00
Hao Liu
cf21eab7f4 [chore] update project_update playbook to be compliant with ansible-lint
reshaving the yak

Co-Authored-By: Gabriel Muniz <gmuniz@redhat.com>
2023-02-27 18:32:10 -05:00
Joe Garcia
98b2f51c18 fix kwargs[] to kwargs.get() 2023-02-27 11:52:44 -05:00
Joe Garcia
327352feaf Add default value to webservice_id kwarg 2023-02-27 11:26:52 -05:00
Alan Rominger
ccaace8b30 Merge pull request #13541 from npithonDR/devel
Fix error for byweekday in schedule_rruleset
2023-02-27 10:24:48 -05:00
Hao Liu
2902b40084 Merge pull request #13623 from TheRealHaoLiu/revert-project-update-playbook
Revert project_update.yml
2023-02-27 08:47:24 -05:00
Hao Liu
9669b9dd2f Revert project_update.yml
Due to problem found in testing reverting

019e6a52fe
2023-02-27 08:23:27 -05:00
Shane McDonald
d27aada817 Merge pull request #13619 from shanemcd/non-root-path-dev-env
Allow serving app from non-root path in dev env
2023-02-24 09:52:34 -05:00
Shane McDonald
2fca07ee4c Allow serving app from non-root path in dev env
Usage:

$ EXTRA_SOURCES_ANSIBLE_OPTS='-e ingress_path=/awx' make docker-compose
$ curl http://localhost:8013/awx/api/v2/ping/
2023-02-24 09:29:17 -05:00
npithonDR
335ac636b5 Merge pull request #1 from AlanCoding/npithon
Follow comments, split non-list objects
2023-02-24 08:42:00 +01:00
Shane McDonald
f4bcc03ac7 Merge pull request #12242 from adpavlov/12241-websocket-custom-path
Fix websockets when application is served from a non-root path
2023-02-23 12:25:22 -05:00
Alan Rominger
3051384f95 Follow suggestion from comment, split if NOT list 2023-02-23 12:05:32 -05:00
Alan Rominger
811ecb8673 Follow suggestion from comment, split if NOT list 2023-02-23 12:05:21 -05:00
Alexander Pavlov
5e28f5dca1 Remove trailing $ from websocket_urlpatterns to work with custom path to fix #12241
Signed-off-by: Alexander Pavlov <alexander.pavlov@amdocs.com>
2023-02-23 12:02:47 -05:00
Hao Liu
d088d36448 Merge pull request #13618 from TheRealHaoLiu/head-to-tail
[fix] switch from head to tail in project update playbook when clearing project dir
2023-02-23 11:13:03 -05:00
Hao Liu
89e41597a6 switch from head to tail
from @relrod

`head` will close the input fd when it no longer needs it (or exits). `find` will try to write to the closed fd and somewhere along the way, it will receive SIGPIPE as a result. This is why `yes | head -5` doesn't run forever.
2023-02-23 10:46:48 -05:00
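That pipe behavior is easy to demonstrate outside the playbook; a minimal sketch in Python using only the standard `subprocess` module:

```python
# Demonstrates the SIGPIPE behavior described above: once the reader
# (`head`) exits and no read ends remain open, the writer (`yes`) is
# killed by SIGPIPE instead of running forever.
import subprocess

yes = subprocess.Popen(['yes'], stdout=subprocess.PIPE)
head = subprocess.run(['head', '-5'], stdin=yes.stdout, capture_output=True)
yes.stdout.close()  # drop our copy of the read end
yes.wait()          # `yes` hits SIGPIPE on its next write and exits
print(head.stdout)  # b'y\ny\ny\ny\ny\n'
```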
Hao Liu
283adc30a8 Merge pull request #13526 from TheRealHaoLiu/project_update_playbook_lint
[chore] Update project_update playbook to be compliant with ansible-lint
2023-02-22 21:39:42 -05:00
Hao Liu
019e6a52fe Update project_update playbook to be compliant with ansible-lint 2023-02-22 19:30:24 -05:00
Hao Liu
35e5610642 Merge pull request #13615 from TheRealHaoLiu/update-kind-devel-doc
update kind development environment instruction
2023-02-22 19:25:03 -05:00
Hao Liu
3a303875bb update kind development environment instruction 2023-02-22 16:18:53 -05:00
Alan Rominger
4499a50019 Merge pull request #13595 from sean-m-sullivan/devel
fix inventory prompt on launch for workflow nodes
2023-02-22 10:23:02 -05:00
Alan Rominger
3fe46e2e27 Merge pull request #13606 from AlanCoding/copy_login
Give proper 401 code to user not logged in
2023-02-21 16:31:23 -05:00
Alan Rominger
6d3f39fe92 Give proper 401 code to user not logged in 2023-02-21 13:34:29 -05:00
Alan Rominger
a3233b5fdd Merge pull request #13594 from AlanCoding/approval_collection
Add integration test and docs for workflow_approval module
2023-02-21 09:03:17 -05:00
sean-m-sullivan
fe3aa6ce2b fix inventory prompt on launch for workflow nodes 2023-02-18 23:13:46 -05:00
Gabriel Muniz
77ec46f6cf Merge pull request #13593 from gamuniz/fix_workflowapproval_view
Make /api/v2/workflow_approvals/ endpoint read-only
2023-02-17 18:19:04 -05:00
Alan Rominger
b5f240ce70 Add integration test and docs for workflow_approval module 2023-02-17 15:10:59 -05:00
Gabe Muniz
fb2647ff7b changing the signature of WorkflowApprovalList
included workflow approval as a read-only endpoint to pass collection tests
2023-02-17 14:57:54 -05:00
Stanislav Zaprudskiy
35fbb94aa6 Use CLUSTER_HOST_ID as default hostname argument value
Incorporates feedback from https://github.com/ansible/awx/pull/13445/files#r1106012308

Signed-off-by: Stanislav Zaprudskiy <s.zaprudskiy@sap.com>
2023-02-17 18:10:08 +01:00
Stanislav Zaprudskiy
f2ab8d637c Do not discard jobs w/ .started=None 2023-02-17 18:10:08 +01:00
Stanislav Zaprudskiy
166b586591 Support indefinitely waiting for jobs to finish
Signed-off-by: Stanislav Zaprudskiy <s.zaprudskiy@sap.com>
2023-02-17 18:10:08 +01:00
Stanislav Zaprudskiy
d1c608a281 Reformat with black
Signed-off-by: Stanislav Zaprudskiy <s.zaprudskiy@sap.com>
2023-02-17 18:10:08 +01:00
Stanislav Zaprudskiy
b4803ca894 Add disable_instance management command
Signed-off-by: Stanislav Zaprudskiy <s.zaprudskiy@sap.com>
2023-02-17 18:10:08 +01:00
Tom Siewert
ce7f597c7e Makefile: Make docker-compose command configurable
docker-compose v1 has been EOL since April 2022 and hasn't received any
updates since May 2021. docker compose v2 is a complete rewrite in
Go that acts as a plugin for the main docker application.
The syntax is the same; only the `compose` command differs.
This commit adds the ability to override the default `docker-compose`
command using `make DOCKER_COMPOSE='docker compose'`.

Signed-off-by: Tom Siewert <tom@siewert.io>
2023-02-16 14:47:39 +01:00
John Westcott IV
23a34c5dc9 Merge pull request #13466 from john-westcott-iv/ee_debugging
Enhancing debugging of `The project could not sync because there is no Execution Environment`
2023-02-16 08:11:30 -05:00
John Westcott IV
bef3da6fb2 Merge pull request #13304 from john-westcott-iv/limit_actions
Only allow promote and stage to run on the awx repo
2023-02-16 08:05:23 -05:00
Alan Rominger
7f50679e68 Do not create setting with invalid value in data migration (#13576)
* Do not create setting with invalid value in data migration

* Add test for conf app data migration
2023-02-15 14:54:46 -05:00
John Westcott IV
52d071f9d1 Merge pull request #13573 from john-westcott-iv/ldap_issue
Fixing LDAP users not being properly added to managed teams
2023-02-15 13:25:34 -05:00
John Westcott IV
26a888547d Fixing variable with duplicate name which was causing errors with LDAP team addition 2023-02-14 14:56:13 -05:00
Shane McDonald
05af2972bf Merge pull request #13562 from siw36/fix-typo-generic-oidc
Fix a typo in the help text for Generic OIDC
2023-02-13 12:33:42 -05:00
Robin Klussmann
60458bebfd Fix a typo in the help text for Generic OIDC 2023-02-13 17:11:29 +01:00
npithonDR
951eee944c Add additional rruleset tests 2023-02-13 09:50:11 +01:00
npithonDR
4630757f5f Fix error for byweekday in schedule_rruleset
Fix error:
```
fatal: [localhost]: FAILED! => {
    "msg": "An unhandled exception occurred while running the lookup plugin 'awx.awx.schedule_rruleset'. Error was a <class 'ansible.errors.AnsibleError'>, original message: In rule 1 byweekday must only contain values in monday, tuesday, wednesday, thursday, friday, saturday, sunday. In rule 1 byweekday must only contain values in monday, tuesday, wednesday, thursday, friday, saturday, sunday"
}
```

with:
```
    - name: Build a complex schedule for every monday using the rruleset plugin
      awx.awx.schedule:
        name: "Test build complex schedule"
        state: present
        unified_job_template: "template name"
        rrule: "{{ query('awx.awx.schedule_rruleset', '2030-04-30 10:30:45', rules=rrules, timezone='Europe/Paris' ) }}"
      vars:
        rrules:
          - frequency: 'day'
            interval: 1
            byweekday: 'monday'
```
2023-02-09 09:34:10 +01:00
Hao Liu
46ea031566 Merge pull request #13539 from gamuniz/fix_dependent_schedule_export
[fix] adding Schedule to dependent_export to allow previous behavior on job template export
2023-02-08 17:04:35 -05:00
Gabe Muniz
0d7bbb4389 [AAP-8682] adding Schedule to dependent_export to allow previous behavior on job template export 2023-02-08 16:19:29 -05:00
Seth Foster
1dda373aaf Merge pull request #13528 from infamousjoeg/fix-13527-conjur-exception-bug
Fixes #13527 CyberArk Conjur Secrets Manager Lookup Exception Bug
2023-02-08 15:12:49 -05:00
Seth Foster
33c1968210 Merge pull request #13332 from fosterseth/update_clustering_md
Update clustering.md to be more current
2023-02-07 20:04:51 -05:00
Joe Garcia
049a158638 Fixes ansible/awx #13527 2023-02-07 10:47:51 -05:00
Sarah Akus
32f7295f44 Merge pull request #13247 from kialam/audit-fix-only
Fix high severity vulnerabilities.
2023-02-06 13:15:07 -05:00
Alan Rominger
6772fb876b Merge pull request #13522 from AlanCoding/no_events
Skip callback receiver bulk_create with 0 events
2023-02-06 12:02:20 -05:00
Alan Rominger
51112b95bc Add test for callback events flush with nothing in the buffer 2023-02-05 22:46:50 -05:00
Alan Rominger
6c1d4a5cfd Skip callback receiver bulk_create with 0 events 2023-02-04 12:10:39 -05:00
Alan Rominger
2e9106d8ea Merge pull request #13516 from AlanCoding/github_ci_runner
Attempt to consolidate CI logic with github_ci_runner target
2023-02-03 15:39:39 -05:00
Alan Rominger
84822784e8 Get rid of label because it is confusing 2023-02-03 14:24:43 -05:00
Alan Rominger
0f3adb52b1 Add help comments and reorg targets for separation 2023-02-03 14:24:43 -05:00
Alan Rominger
59da9a29df Delete everything about py_version in CI workflow 2023-02-03 14:24:43 -05:00
Alan Rominger
a949ee048a Consolidate CI logic with github_ci_runner target
Delete outright the step to install python

Fix typo that failed to label stage
2023-02-03 14:24:43 -05:00
John Westcott IV
b959bc278f Merge pull request #13475 from john-westcott-iv/add_m2m_unit_test
Adding functional test for LDAP _update_m2m_relationships
2023-02-03 10:59:45 -05:00
Lila Yasin
052644eb9d Merge pull request #13459 from djyasin/forwardport_deps_bump
Updating wheel and gitpython dependencies
2023-02-03 10:35:24 -05:00
Kia Lam
4e18827909 Add new licenses and remove old ones. 2023-02-02 14:34:59 -08:00
Kia Lam
59ce8c4148 Upgrade high and critical dependencies. 2023-02-02 14:07:28 -08:00
John Westcott IV
3b9c04bf1e Merge pull request #13515 from john-westcott-iv/fix_awx_collection_project_module
Fixing awx_collection sanity testing
2023-02-02 13:56:42 -05:00
John Westcott IV
f28203913f Fixing indentation in project module 2023-02-02 13:34:19 -05:00
Alan Rominger
9b2725e5fe Merge pull request #13500 from AlanCoding/group_options
Fix OPTIONS permissions bug in groups list
2023-02-02 12:55:04 -05:00
Alan Rominger
1af955d28c Merge pull request #13267 from philipsd6/feature/complex_extra_vars
Enable support for injecting complex extra vars
2023-02-02 10:13:49 -05:00
Rick Elrod
0815f935ca [collection] remove module defaults where API defaults are the same (#13037)
Providing defaults for API parameters where the API already provides
defaults leads to some confusing scenarios, because we end up always
sending those collection-defaulted fields in the request even if the
field isn't provided by the user.

For example, we previously set the `scm_type` default to 'manual', and
someone using the collection to update a project who does not explicitly
include the `scm_type` every time they call the module would
inadvertently change the `scm_type` of the project back to 'manual',
which is surprising behavior.

This change removes the collection defaults for API parameters, unless
they differed from the API default. We let the API handle the defaults
or otherwise ignore fields not given by the user so that the user does
not end up changing unexpected fields when they use a module.

Signed-off-by: Rick Elrod <rick@elrod.me>
2023-02-01 15:37:08 -06:00
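A minimal sketch of the pattern this commit describes, with an illustrative argument spec rather than the collection's actual module code: leaving `default` unset means an omitted parameter stays `None` and is never sent to the API.

```python
# Sketch: no collection-side default for scm_type. If the user omits it,
# the field is dropped from the request and the API keeps the project's
# existing value (or applies its own default on creation).
from ansible.module_utils.basic import AnsibleModule

module = AnsibleModule(
    argument_spec=dict(
        name=dict(type='str', required=True),
        scm_type=dict(type='str', choices=['manual', 'git', 'svn']),  # no default=
    ),
)

# Build the request payload only from parameters the user actually set.
payload = {key: value for key, value in module.params.items() if value is not None}
```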
Alan Rominger
6997876da6 Fix OPTIONS permissions bug in groups list 2023-02-01 16:19:24 -05:00
Alan Rominger
93d84fe2c9 Merge pull request #13502 from AlanCoding/new_black
Update to comply with new black rules
2023-02-01 16:18:50 -05:00
Alan Rominger
f5785976be Update to comply with new black rules 2023-02-01 14:59:38 -05:00
Seth Foster
61c7d4e4ca Merge pull request #13455 from infamousjoeg/fix-13439-support-conjur-oss
Fixes #13439 Add exception handling for `/api` on url
2023-01-31 16:28:31 -05:00
Alan Rominger
a2f528e6e5 Fix syntax bug that came from fixing sanity tests (#13473) 2023-01-31 15:55:20 -05:00
Hao Liu
058ae132cf Merge pull request #13489 from gamuniz/add_management_command
adding new management command to allow failsafe enabling of local auth
2023-01-31 13:52:10 -05:00
Hao Liu
6483575437 Merge pull request #13379 from OscarBell/issue_13377
Fix verbosity parameter choices for ad_hoc_command module
2023-01-31 13:21:27 -05:00
Hao Liu
a15a23c1d3 Merge pull request #13483 from mahaffey/cli-add-order-by
add '--order_by' option to awx CLI
2023-01-31 13:13:52 -05:00
Gabe Muniz
ffdcb9f4dd fixed error in help dialog 2023-01-31 12:54:17 -05:00
Gabe Muniz
2d9da11443 refactored the code to pass both enable and disable flags 2023-01-30 21:07:17 -05:00
John Westcott IV
5ce6c14f74 Merge pull request #13490 from john-westcott-iv/tallyoh-update-saml.md
Update "one or more" fields in SAML documentation.
2023-01-30 15:53:06 -05:00
Sarah Akus
61748c072d Merge pull request #13450 from mabashian/re-add-workflow-approval-bulk-actions
Re-add workflow approval bulk actions to workflow approvals list
2023-01-30 15:30:12 -05:00
tallyoh
89dae3865d Update saml.md
According to the latest documentation, role and value are now "one or more" fields, so they both need to be arrays. Entering the JSON data as written in this article doesn't work, but once I added the brackets it worked.
Thank you
2023-01-30 15:26:54 -05:00
Michael Abashian
808ab9803e Re-add workflow approval bulk actions to workflow approvals list 2023-01-30 14:54:35 -05:00
Gabe Muniz
d64b6d4dfe adding new management command to allow failsafe enabling of local authentication for disaster recovery or in case 3rd party authentication becomes unavailable 2023-01-30 14:31:26 -05:00
Ryan Mahaffey
c9d931ceee add '--order-by' option as supplied by the awx api 2023-01-27 18:21:34 -08:00
John Westcott IV
8fb831d3de SAML enhancements (#13316)
* Moving reconcile_users_org_team_mappings into common library

* Renaming pipeline to social_pipeline

* Breaking out SAML and generic Social Auth

* Optimizing SAML login process

* Moving extraction of org in teams from backends into sso/common.create_orgs_and_teams

* Altering saml_pipeline from testing

Prefixing all internal functions with _
Modified subfunctions to not return values but instead manipulate mutable objects
Modified all functions to not add duplicate orgs to the orgs_to_create list

* Updating the common function to respect a teams organization name

* Added can_create flag to create_org_and_teams

This made testing easier and lets any adapter with such a flag simply pass it into the function

* Multiple changes to SAML pipeline

Removed orgs_to_create from being passed into user_team functions; the common org-creation code will add any team orgs to the list of orgs automatically

Passed SAML_AUTO_CREATE_OBJECTS flag into create_org_and_teams

Fix bug where we were looking at values instead of keys

Added loading of all teams if remove flag is set in update_user_teams_by_saml_attr

* Moving common items between SAML and Social into a 'base'

* Updating and adding testing

* Renamed get_or_create_with_default_galaxy_cred to get_or_create_org_...
2023-01-27 11:49:16 -03:00
Joe Garcia
64865af3bb Fix API Lint Failure - remove bare excepts 2023-01-26 16:27:29 -05:00
John Westcott IV
9f63c99bee Adding functional test for LDAP _update_m2m_relationships 2023-01-26 16:10:27 -05:00
anxstj
d7025a919c sso/backends: remove_* must not change the user (#13430)
_update_m2m_from_groups must return None if remove_* is false or empty,
because None indicates that the user permissions will not be changed.

related #13429
2023-01-26 17:38:43 -03:00
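A hedged sketch of the contract described here, with illustrative names and arguments rather than the backend's real signature: `None` means "leave the user's membership untouched", while `True`/`False` grant or revoke it.

```python
def _update_m2m_from_groups(user_group_dns, trigger_group_dns, remove=True):
    """Return True to add the user, False to remove them, None to do nothing."""
    if any(dn in user_group_dns for dn in trigger_group_dns):
        return True   # user is in a triggering group: grant membership
    if remove:
        return False  # removal explicitly enabled: revoke membership
    return None       # remove_* is false or empty: caller must not change the user
```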
Gabe Muniz
dab7d91cff adding new management command to allow failsafe enabling of local authentication for disaster recovery or in case 3rd party authentication becomes unavailable 2023-01-26 14:11:17 -05:00
John Westcott IV
61821faa00 Merge pull request #13476 from john-westcott-iv/security_requested_change
Nominal change to the pr body check
2023-01-25 17:38:55 -05:00
John Westcott IV
c26d211ee0 Nominal change to the pr body check 2023-01-25 17:12:43 -05:00
Lila
6a79d19668 Removed duplicate license file. 2023-01-25 11:23:10 -05:00
Lila
47176cb31b regenerated .txt file. 2023-01-25 10:16:40 -05:00
John Westcott IV
5163795cc0 Merge pull request #13397 from ansible/djyasin-patch-1
Update triage_replies.md
2023-01-25 10:12:06 -05:00
Oscar
b0a4173545 13377: Choices list for verbosity parameter should be a list of integers
Signed-off-by: Oscar <oscar.bell@bell.local>
2023-01-25 08:47:13 +01:00
John Westcott IV
eb9431ee1f Fixing hard coded project 2023-01-24 13:50:07 -05:00
John Westcott IV
fd6605932a Adding exception if unable to find the control plane EE 2023-01-24 13:50:07 -05:00
John Westcott IV
ea9c52aca6 Merge pull request #13461 from john-westcott-iv/no_galaxy_if_published
Two changes to GitHub promote action
2023-01-23 16:02:03 -05:00
John Westcott IV
a7ebce1fef Update .github/workflows/promote.yml
Co-authored-by: Rick Elrod <rick@elrod.me>
2023-01-23 15:43:44 -05:00
John Westcott IV
5de9cf748d Two changes to promote action
Perform a git reset --hard before attempting to release awxkit to PyPI.
We found that something new in the process was causing unexpected behavior if the git tree had any changes in it: a devel version would be created and used as part of the upload, which PyPI was refusing.

Collections cannot easily be deleted from Galaxy, so if we have to rerun a job because of a PyPI or Quay failure, we don't want to try to upload the collection again.
2023-01-23 15:37:02 -05:00
Jake Jackson
ebea78943d Deprecate tower modules (#13210)
* first deprecation pass, need to confirm date or version

* remove doc block updates as not needed, update runtime and remove symlinks

* add line to readme as notable release

* update version before release
2023-01-23 13:44:26 -05:00
Lila
bb387f939b Ran updater script to generate new requirements.txt file. 2023-01-23 11:58:26 -05:00
Satoe Imaishi
bda806fd03 Merge pull request #6276 from simaishi/43_bump_deps
[4.3] Bump python dependencies for security fixes
2023-01-23 11:43:20 -05:00
Alan Rominger
9777ce7fb8 Touchup of validation logic from testing 2023-01-23 11:01:08 -05:00
Seth Foster
1e33bc4020 Merge pull request #13338 from fosterseth/tag_awx_ee_on_release
tag awx-ee latest on awx release
2023-01-20 12:44:52 -05:00
Joe Garcia
d8e7c59fe8 change except to get response instead of raise error 2023-01-20 11:40:51 -05:00
Joe Garcia
4470b80059 Add exception handling for /api on url 2023-01-20 11:34:35 -05:00
Divided by Zer0
e9ad01e806 Handles workflow node schema inventory (#12721)
Verified by QE. Merging it.
2023-01-19 18:25:19 -03:00
Alan Rominger
8a4059d266 Workaround for events with NUL char, touch up error loop (#13398)
* Workaround for events with NUL char, touch up error loop

This fixes an error where some events would not save
  because they contained the 0x00 character, which errors in Postgres;
  this adds a line to replace it with empty text

Hitting that kind of event put us in an infinite error loop,
  so this change makes a number of changes to prevent similar loops;
  the showcase example is a negative counter,
  which is not realistic in the real world but works for unit tests

These error-loop fixes seek to establish the cases where we clear the buffer.
Some logic is removed from the outer loop, with the idea that
ensure_connection will better distinguish flake

* From review comments, delay NUL char sanitization to later

Use pop to make list operations more clear

* Fix incorrect use of pop
2023-01-19 13:36:23 -05:00
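The NUL-byte workaround amounts to a one-line sanitization before the save; a sketch under the assumption that event text lives in a plain string field:

```python
# PostgreSQL text columns reject the 0x00 byte, so replace it with empty
# text right before bulk-saving events (sanitization is delayed until
# save time, per the review comment above).
def sanitize_nul_bytes(text):
    if text and '\x00' in text:
        return text.replace('\x00', '')
    return text
```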
Seth Foster
01a7076267 Merge pull request #13433 from kwevers/bugfix/hashicorp-vault-retries
Retry HashiCorp Vault requests on HTTP 412
2023-01-18 16:00:40 -05:00
Seth Foster
32b6aec66b Merge pull request #13444 from codygula/devel
Update to include pip install command and PyPI link. related #13179
2023-01-18 15:51:28 -05:00
John Westcott IV
884ab424d5 Merge pull request #12832 from no-12/allow_metrics_for_anonymous_users
Allow metrics collection for anonymous users via settings
2023-01-18 09:46:35 -05:00
Cody Gula
7e55305c45 Update to include pip install command and PyPI link
Signed-off-by: Cody Gula <cgula7@gmail.com>
2023-01-17 19:04:57 -08:00
Philip Douglass
7f6f57bfee Maintain nested context for validation error messages 2023-01-17 19:03:32 -05:00
Philip Douglass
ae92f8292f Account for validation context 2023-01-17 19:03:32 -05:00
Philip Douglass
51e244e183 Expand pattern to support use of Jinja2 block delimiters 2023-01-17 19:03:32 -05:00
Philip Douglass
ad4e257fdb Add functions to support recursive validation for extra_vars 2023-01-17 19:03:32 -05:00
Philip Douglass
fcf56950b3 Add recursive properties to injectors jsonschema for extra_vars 2023-01-17 19:03:32 -05:00
Philip Douglass
27ea239c00 Add two tests for nested and templated extra_vars keys 2023-01-17 19:03:32 -05:00
Philip Douglass
128a130b84 Update documentation to include subkey injection 2023-01-17 19:03:32 -05:00
Philip Douglass
d75f12c001 Render keys while walking extra_vars in addition to values 2023-01-17 19:03:32 -05:00
Philip Douglass
2034eac620 Add function to walk the extra_vars and render the results 2023-01-17 19:03:32 -05:00
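Taken together, the extra_vars commits above describe a recursive walk that renders dict keys as well as values; a minimal sketch of that shape, where the `render` callable stands in for the real Jinja2-based templating:

```python
def walk_and_render(node, render):
    """Recursively render dict keys, dict values, list items, and strings."""
    if isinstance(node, dict):
        return {render(key): walk_and_render(value, render) for key, value in node.items()}
    if isinstance(node, list):
        return [walk_and_render(item, render) for item in node]
    if isinstance(node, str):
        return render(node)
    return node
```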
Sarah Akus
e9a1582b70 Merge pull request #13262 from AlexSCorey/12429-PrepopulateResources
Prepopulates job template form with related resource
2023-01-17 17:43:02 -05:00
Alex Corey
51ef1e808d Prepopulates job template form with related resource 2023-01-17 13:10:07 -05:00
Lila Yasin
11fbfc2063 added fix for preserve existing children issue. (#13374)
* added fix for preserve existing children issue.

* Modified line 131 to call actual param name.

* Removed line 132 after updating.
2023-01-16 11:36:07 -03:00
Kristof Wevers
f6395c69dd Retry HashiCorp Vault requests on HTTP 412
HC Vault clusters use eventual consistency and might return an HTTP 412
if the secret ID hasn't yet replicated to the standby nodes.
If this happens, the request should be retried.

related #13413

Signed-off-by: Kristof Wevers <kristof.wevers@infura.eu>
2023-01-16 13:29:33 +01:00
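A hedged sketch of that retry loop using `requests`; the URL, headers, and retry budget are placeholders, not the credential plugin's real values:

```python
import time
import requests

def read_secret_with_retry(url, headers, retries=5, delay=1.0):
    """Retry on HTTP 412 until the secret ID has replicated to standby nodes."""
    for _ in range(retries):
        resp = requests.get(url, headers=headers)
        if resp.status_code != 412:
            resp.raise_for_status()
            return resp.json()
        time.sleep(delay)    # 412: not yet consistent, try again
    resp.raise_for_status()  # still 412 after all retries: surface the error
```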
kialam
ca07bc85cb Merge pull request #13367 from kialam/fix-13290-instance-404
Conditionally query /health_check endpoint for execution node only.
2023-01-12 13:20:35 -08:00
Seth Foster
b87dd6dc56 tag awx-ee latest with awx release 2023-01-11 17:21:51 -05:00
Seth Foster
f8d46d5e71 Merge pull request #13351 from jangel97/project_lokfile_timeout
add logging to situation in which project lock file is locked
2023-01-10 20:58:53 -05:00
Jose Angel Morena
ce0a456ecc add log message if unable to open lockfile
Signed-off-by: Jose Angel Morena <jmorenas@redhat.com>
2023-01-10 21:51:23 +01:00
Nico Ohnezat
5775ff1422 make help text of ALLOW_METRICS_FOR_ANONYMOUS_USERS more clear 2023-01-10 09:32:25 +01:00
Nico Ohnezat
82e8bcd2bb related #6753 allow metrics for anonymous users
Signed-off-by: Nico Ohnezat <nico@no-12.net>
2023-01-10 09:32:25 +01:00
John Westcott IV
d73cc501d5 Merge pull request #13342 from john-westcott-iv/reconcile_fix
Fixing bug in LDAP reconcile loop
2023-01-09 14:20:49 -05:00
John Westcott IV
7e40a4daed Refactoring code 2023-01-09 10:31:15 -05:00
John Westcott IV
47e824dd11 Fixing LDAP reconcile loop 2023-01-09 10:31:15 -05:00
Sarah Akus
4643b816fe Merge pull request #13075 from keithjgrant/13059-running-job-output-gap
Fix gap between API-loaded job events and WS-streamed events
2023-01-05 13:46:10 -05:00
Seth Foster
79d9329cfa Merge pull request #13403 from fosterseth/fix_console_colors
Fix console color logs
2023-01-05 13:34:13 -05:00
Seth Foster
6492c03965 Fix console color logs 2023-01-05 12:55:20 -05:00
Michael Abashian
98107301a5 Merge pull request #13194 from mabashian/13193-related-name-exact
Adds support for exact name searching against related fields to the ui
2023-01-05 10:20:39 -05:00
Keith J. Grant
4810099158 update test 2023-01-05 09:56:37 -05:00
Michael Abashian
1aca9929ab Adds support for exact name searching against related fields to the ui 2023-01-05 09:56:37 -05:00
Sarah Akus
2aa58bc17d Merge pull request #13372 from vidyanambiar/aap-7757
Fix for Save button not responding on Job Settings page
2023-01-04 13:39:55 -05:00
Lila Yasin
be4b826259 Update triage_replies.md 2023-01-04 11:36:33 -05:00
Shane McDonald
b99a434dee Merge pull request #13395 from shanemcd/pin-rsyslog
Pin rsyslog to avoid crash
2023-01-04 21:54:34 +08:00
Shane McDonald
6cee99a9f9 Pin rsyslog to prevent crash
With the latest version of rsyslog we had a test failing with:

AssertionError: Response data: {'error': "b'rsyslog internal message (3,-2455): could not transfer  the  specified  internal posix  capabilities settings to the kernel, capng_apply=-5\\n [v8.2102.0-107.el9 try https://www.rsyslog.com/e/2455 ]\\n'"}

Downgrading fixes it
2023-01-04 08:19:20 -05:00
Seth Foster
ee509aea56 Merge pull request #12961 from fosterseth/fix_results_traceback
Result_traceback should not include job stdout
2023-01-03 13:34:23 -05:00
Sarah Akus
b5452a48f8 Merge pull request #13196 from keithjgrant/13189-job-traceback
Fix job error traceback in job output
2023-01-03 11:59:58 -05:00
Vidya Nambiar
68e555824d Fix for Save button not responding on Job Settings page
Signed-off-by: Vidya Nambiar <vnambiar@redhat.com>
2022-12-22 11:23:03 -05:00
Seth Foster
0c980fa7d5 Merge pull request #13366 from fosterseth/bump_receptorctl_1.3.0
bump receptorctl version to 1.3.0
2022-12-21 16:27:25 -05:00
Shane McDonald
e34ce8c795 Merge pull request #13365 from dsavineau/downgrade_hiredis
Pin hiredis to 2.0.0
2022-12-21 15:23:15 -05:00
Kia Lam
58bad6cfa9 Conditionally query /health_check endpoint for execution node only. 2022-12-21 10:44:12 -08:00
Seth Foster
3543644e0e bump receptorctl version to 1.3.0 2022-12-21 13:36:11 -05:00
Seth Foster
36c0d07b30 Result_traceback should not include job stdout
If a job fails, we fetch the receptor work results and put that output
into result_traceback.

We should only do this if
1. Receptor unit has failed
2. Runner callback processed 0 events

Otherwise we risk putting too much data into this field.
2022-12-21 13:05:44 -05:00
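The two conditions reduce to a simple guard; a sketch with illustrative names, not AWX's actual attributes:

```python
def should_capture_receptor_output(receptor_unit_failed, runner_event_count):
    # Only surface receptor work results as result_traceback when the unit
    # failed AND the runner callback processed no events; otherwise the
    # output is job stdout and would bloat the field.
    return receptor_unit_failed and runner_event_count == 0
```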
Keith J. Grant
03b0281fde clean up follow mode quirks 2022-12-21 09:30:35 -08:00
Keith J. Grant
6f6f04a071 refresh events when first websocket event streams 2022-12-21 09:30:35 -08:00
Dimitri Savineau
239827a9cf Pin hiredis to 2.0.0
The hiredis 2.1.0 release doesn't provide a source distribution on PyPI, so
users can't build the Python package from source.

Signed-off-by: Dimitri Savineau <dsavinea@redhat.com>
2022-12-21 11:57:41 -05:00
Alan Rominger
ac9871b36f Merge pull request #13361 from relrod/sanity
[collection] Run sanity tests outside of our container
2022-12-21 11:00:21 -05:00
Alan Rominger
f739908ccf Add comment about Ansible-core being installed by default
Co-authored-by: John R Barker <john@johnrbarker.com>
2022-12-21 09:57:00 -05:00
Alan Rominger
cf1ec07eab Changes to run sanity tests locally
Use a Makefile arg for the ansible-test sanity CLI args
  defaults to --docker
  in the future we probably need to customize python versions

Copy the rule exception for Ansible 2.15
  this helps people who are running from Ansible devel
2022-12-21 09:53:22 -05:00
Rick Elrod
d968b648de Run sanity tests outside of our container
Also just ignore one sanity test for the export module, instead of
ignoring all of them.

Also use latest ansible-test, and make it work on GHA (by using podman
instead of docker).

Signed-off-by: Rick Elrod <rick@elrod.me>
2022-12-20 21:40:41 -06:00
Rick Elrod
5dd0eab806 Pin channels-redis to 4.3.1 to fix an async issue (#13348)
Refs django/channels_redis#332
Refs #13313

Signed-off-by: Rick Elrod <rick@elrod.me>
2022-12-20 17:05:44 -06:00
Alan Rominger
41f3f381ec Merge pull request #13352 from AlanCoding/dont_pass_subtasks
Remove `subtasks` keyword arg that can exceed pg_notify max message length
2022-12-20 16:25:39 -05:00
Alan Rominger
ac8cff75ce Run collection sanity tests in CI (#13356)
* Run collection sanity tests in CI

This requires adding a Makefile install of ansible-core

Fake the version to make semver check happy

* Fixes from ansible-test sanity failures

* Exclude the export module due to awxkit requirement

* Fix broken ansible-test rule exceptions

remove Ansible 2.14 exclusions that make ansible-test ERROR, saying they are not needed
2022-12-20 16:06:25 -05:00
Alan Rominger
94b34b801c Avoid unbounded kwargs by fetching subtasks inside handle_work_error
Update tests to new handle_work_error call pattern

Handle blame correctly with multiple serial deps
  add new test case corresponding to this scenario
2022-12-19 16:02:51 -05:00
Jeff Bradberry
8f6849fc22 Include listener_port in the defaults for Instance.objects.register (#13328) 2022-12-19 14:16:05 -03:00
Sarah Akus
821b1701bf Merge pull request #13340 from gamuniz/change_wf_scmbranch_behavior
Change workflow create/edit to null scm_branch when not provided.
2022-12-19 10:52:21 -05:00
John Westcott IV
b7f2825909 Throw a warning if custom secret key was specified but not given (#13128)
* Throw a warning if custom secret key was specified but not given

* Fixing unit tests
2022-12-17 14:15:27 -03:00
Jeff Bradberry
e87e041a2a Break up and conditionally add the RBAC checks for ActivityStream (#13279)
This should vastly improve the queries executed when accessing any of
the activity stream endpoints as a normal user, in many cases.
2022-12-16 15:11:14 -03:00
Gabe Muniz
cc336e791c fix expected test result 2022-12-16 12:30:57 -05:00
Gabe Muniz
c2a3c3b285 The current behavior of workflow job templates is to pass in an empty string as scm_branch on all saves and edits. This becomes problematic when using job templates/workflows that allow prompt on launch for scm_branch, as it may override the scm_branch set for the individual workflow nodes to an empty string. That behavior limits the usefulness of prompting for scm_branch, as it can no longer be selected while creating workflows since it will be overwritten. 2022-12-16 12:30:57 -05:00
Jeff Bradberry
7b8dcc98e7 Merge pull request #13308 from jbradberry/rebuild-org-ee-admin-roles
Ensure that the Organization.execution_environment_admin_role always gets built
2022-12-16 11:29:20 -05:00
Satoe Imaishi
d5011492bf Merge pull request #13343 from simaishi/add_pkgconfig
Add back pkgconfig for offline build
2022-12-16 08:07:38 -05:00
Satoe Imaishi
e363ddf470 Add back pkgconfig for offline build 2022-12-15 20:49:28 -05:00
Shane McDonald
987709cdb3 Merge pull request #13344 from shanemcd/fix-tox
Remove unneeded pass_env in tox config
2022-12-15 20:02:31 -05:00
Shane McDonald
f04ac3c798 Remove unneeded pass_env in tox config
I don't recall us ever using Travis so I'm not sure why this is here.

https://tox.wiki/en/latest/changelog.html#v4-0-6-2022-12-10
2022-12-15 19:44:02 -05:00
Jake Jackson
71a6baccdb Fix lookup plugins sanity (#13238)
* fix pytz

* fix NameError

* fix tests and add sanity ignore files for import test until distutils replaced

* change static method to regular method and update test to instantiate class
2022-12-15 16:40:51 -05:00
Alan Rominger
d07076b686 Merge pull request #13330 from AlanCoding/ask_me_for_tags
Fill in rest of ask_tags handling for WFJT module
2022-12-15 10:59:17 -05:00
John Westcott IV
7129f3e8cd Updating python3-saml (#13263)
Moved to forked version to get latest lxml to allow other packages to update
2022-12-15 12:15:09 -03:00
Julen Landa Alustiza
df61a5cea1 Merge pull request #13126 from infamousjoeg/cyberark-ccp-branding-webserviceid
CyberArk Central Credential Provider Lookup custom Web Service ID & update branding
2022-12-15 15:54:35 +01:00
Ilija Matoski
a4b950f79b Set AWS_SESSION_TOKEN in addition to AWS_SECURITY_TOKEN (#13297)
* Set AWS_SESSION_TOKEN in addition to AWS_SECURITY_TOKEN

* added AWS_SESSION_TOKEN to inventoryupdate-1 test
2022-12-15 10:09:40 -03:00
Seth Foster
1d87e6e04c Update clustering.md to be more current 2022-12-14 22:36:29 -05:00
Sarah Akus
8be739d255 Merge pull request #13306 from vidyanambiar/aap-7507
Fixes 'Not Found' error on looking up credentials
2022-12-14 16:13:55 -05:00
John Westcott IV
ca54195099 Merge pull request #13324 from mannyci/devel
Fix typo in controller_api lookup plugin
2022-12-14 15:19:53 -05:00
Alex Corey
f0fcfdde39 Merge pull request #13257 from ansible/dependabot/npm_and_yarn/awx/ui/devel/luxon-3.1.1
Bump luxon from 3.0.3 to 3.1.1 in /awx/ui
2022-12-14 09:19:47 -05:00
Alex Corey
80b1ba4a35 Merge pull request #13259 from ansible/dependabot/npm_and_yarn/awx/ui/devel/patternfly/react-core-4.264.0
Bump @patternfly/react-core from 4.250.1 to 4.264.0 in /awx/ui
2022-12-14 09:13:32 -05:00
Alan Rominger
51f8e362dc Add tags prompt to integration test 2022-12-14 09:10:15 -05:00
Sarah Akus
737d6d8c8b Merge pull request #13329 from akus062381/add-new-triage-reply
add new triage reply
2022-12-13 16:45:16 -05:00
Alan Rominger
beaf6b6058 Fill in rest of ask_tags handling for WFJT module 2022-12-13 16:38:25 -05:00
akus062381
aad1fbcef8 add new triage reply 2022-12-13 16:17:42 -05:00
Rick Elrod
0b96d617ac Fix BROADCAST_WEBSOCKET_PORT for Kube dev (#13243)
- `settings/minikube.py` gets imported conditionally, when the
  environment variable `AWX_KUBE_DEVEL` is set. In this imported file,
  we set `BROADCAST_WEBSOCKET_PORT = 8013`, but 8013 is only used in the
  docker-compose dev environment. In Kubernetes environments, 8052 is
  used for everything. This is hardcoded in awx-operator's ConfigMap.

- Also rename `minikube.py` because it is used for every kind of
  development Kube environment, including Kind.

Signed-off-by: Rick Elrod <rick@elrod.me>
2022-12-13 15:07:15 -06:00
Alan Rominger
fe768a159b Merge pull request #13295 from AlanCoding/raw_instance_data
Remove un-editable Instance fields from pre-filled edit data in API browser
2022-12-13 15:16:34 -05:00
Alan Rominger
c1ebea858b Merge pull request #13291 from AlanCoding/policy_want_a_cracker
Add missing disassociate trigger for policy task
2022-12-13 11:35:22 -05:00
Seth Foster
da9b8135e8 Merge pull request #13315 from fosterseth/update_task_manager_md
update task manager docs after refactoring
2022-12-12 12:42:49 -05:00
Elijah DeLee
76cecf3f6b update capacity docs to cover hybrid node case
this came up in conversation and I saw this was not in this doc as an example
2022-12-12 12:11:56 -05:00
Manas Maiti
7b2938f515 fix typo 2022-12-12 18:01:15 +01:00
Seth Foster
916b5642d2 Update task manager docs
- DependencyManager and WorkflowManager
- bulk reschedule
- global task manager timeout
- blocking logic

Co-authored-by: Elijah DeLee <kdelee@redhat.com>
Co-authored-by: John R Barker <john@johnrbarker.com>
2022-12-12 11:56:40 -05:00
Jeff Bradberry
e524d3df3e Replace the role fixup post_migrate handler with a data migration 2022-12-12 10:20:56 -05:00
Rick Elrod
01e9a611ea Add broadcast_websocket to LOG_AGGREGATOR_LOGGERS
... so that errors from it get logged to external loggers by default.

Signed-off-by: Rick Elrod <rick@elrod.me>
2022-12-08 17:50:20 -06:00
John Westcott IV
5d96ee084d Adding endswith(awx) to stage 2022-12-08 16:36:04 -05:00
John Westcott IV
e2cee10767 Update .github/workflows/promote.yml
Co-authored-by: Shane McDonald <me@shanemcd.com>
2022-12-08 16:34:13 -05:00
Rick Elrod
ef29589940 Fix duped stats name and Redis for wsbroadcast
This fixes several things related to our wsbroadcast stats handling.
This was found during the ongoing wsrelay work.

There are really three fixes here:

- Logging was not actually enabled for the analytics.broadcast_websocket
  module, so that has been added to our loggers config.

- analytics.broadcast_websocket was not actually able to connect to
  Redis due to 68614b83c0 as part of
  the work in #13187. But there was no easy way to know this because the
  logging issue meant no exceptions showed up anywhere reasonable.

- Relatedly, and also as part of #13187, we jumped from
  `prometheus-client` 0.7.1 up to 0.15.0. This included a breaking
  change where a `Counter` ending with `_total` will clash with a
  `Gauge` of the same name but without `_total`. I am not 100% sure of
  the reasoning here, other than "OpenMetrics compatibility".

Refs #13301
Refs #13187

Signed-off-by: Rick Elrod <rick@elrod.me>
2022-12-08 12:54:08 -06:00
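The `_total` clash is easy to reproduce with prometheus-client 0.15; a standalone sketch with made-up metric names, unrelated to AWX's:

```python
from prometheus_client import CollectorRegistry, Counter, Gauge

registry = CollectorRegistry()
# Counter strips the `_total` suffix, registering under the base name
# `messages` and exposing `messages_total` / `messages_created` samples.
Counter('messages_total', 'Messages sent', registry=registry)
# A Gauge named `messages` now collides with the Counter's base name:
# ValueError: Duplicated timeseries in CollectorRegistry
Gauge('messages', 'Messages in flight', registry=registry)
```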
Vidya Nambiar
cec2d2dfb9 minor rearrangement of imports
Signed-off-by: Vidya Nambiar <vnambiar@redhat.com>
2022-12-08 10:52:20 -05:00
Jeff Bradberry
15b7ad3570 Add a post_migrate signal handler to rebuild the Org roles
particularly, the execution_environment_admin_role.
2022-12-07 15:57:20 -05:00
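Wiring such a handler follows Django's standard signal pattern; a sketch under that assumption, with an illustrative handler body and app name:

```python
from django.apps import AppConfig
from django.db.models.signals import post_migrate

def rebuild_org_roles(sender, **kwargs):
    # Recompute implicit Organization roles, in particular the
    # execution_environment_admin_role, after migrations complete.
    ...

class MainConfig(AppConfig):
    name = 'awx.main'

    def ready(self):
        post_migrate.connect(rebuild_org_roles, sender=self)
```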
Vidya Nambiar
36ff9cbc6d revert change to package.json
Signed-off-by: Vidya Nambiar <vnambiar@redhat.com>
2022-12-07 15:03:40 -05:00
Vidya Nambiar
ed74d80ecb Fixes 'Not Found' error on looking up credentials
remove redundant console logs

typo

Signed-off-by: Vidya Nambiar <vnambiar@redhat.com>
2022-12-07 15:00:28 -05:00
John Westcott IV
31c2e1a450 Only allow promote and stage to run on the awx repo 2022-12-07 14:09:36 -05:00
Alan Rominger
a0b8215c06 Merge pull request #13296 from AlanCoding/signing_bug
Fix bug, sign work based signing, not verification
2022-12-07 08:44:57 -05:00
Alan Rominger
f88b993b18 Fix bug, sign work based signing, not verification 2022-12-06 16:21:17 -05:00
Alan Rominger
4a7f4d0ed4 Remove uneditable Instance fields from API browser 2022-12-06 15:20:04 -05:00
Alan Rominger
6e08c3567f Add missing disassociate trigger for policy task 2022-12-06 14:43:13 -05:00
Jeff Bradberry
adbcb5c5e4 Merge pull request #13289 from jbradberry/improve-psql-paging
Make sure that the psql pager does not clear the screen afterwards
2022-12-06 13:17:24 -05:00
Jeff Bradberry
8054c6aedc Make sure that the psql pager does not clear the screen afterwards
Also, avoid paging if there is a single page.
2022-12-06 10:46:47 -05:00
dependabot[bot]
58734a33c4 Bump @patternfly/react-core from 4.250.1 to 4.264.0 in /awx/ui
Bumps [@patternfly/react-core](https://github.com/patternfly/patternfly-react) from 4.250.1 to 4.264.0.
- [Release notes](https://github.com/patternfly/patternfly-react/releases)
- [Commits](https://github.com/patternfly/patternfly-react/compare/@patternfly/react-core@4.250.1...@patternfly/react-core@4.264.0)

---
updated-dependencies:
- dependency-name: "@patternfly/react-core"
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-12-06 15:33:23 +00:00
dependabot[bot]
2832f28014 Bump luxon from 3.0.3 to 3.1.1 in /awx/ui
Bumps [luxon](https://github.com/moment/luxon) from 3.0.3 to 3.1.1.
- [Release notes](https://github.com/moment/luxon/releases)
- [Changelog](https://github.com/moment/luxon/blob/master/CHANGELOG.md)
- [Commits](https://github.com/moment/luxon/compare/3.0.3...3.1.1)

---
updated-dependencies:
- dependency-name: luxon
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-12-06 15:30:50 +00:00
Seth Foster
e5057691ee Merge pull request #13252 from max-len/patch-1
Update install.rst
2022-12-02 22:46:26 -05:00
Shane McDonald
a0cfd8501c Merge pull request #13274 from rooftopcellist/fix-messages-cmd
Fix make messages target by specifying lang
2022-12-02 19:04:09 -05:00
Shane McDonald
99b643bd77 Merge pull request #13268 from simaishi/fix_static
Copy UI static files to /var/lib/awx only for ui-devel build
2022-12-02 19:03:48 -05:00
Sarah Akus
305b39d8e5 Merge pull request #13209 from marshmalien/5990-related-group-column
Add inventory host list related groups column
2022-12-02 16:23:09 -05:00
Christian M. Adams
642003e207 Fix make messages target by specifying lang 2022-12-02 10:46:16 -05:00
Satoe Imaishi
06daebbecf Copy UI static files to /var/lib/awx only for ui-devel build 2022-12-01 08:58:05 -05:00
Max Lendrich
eaccf32aa3 Update install.rst
Fix doc for current pip==22.3
2022-11-30 16:54:42 +01:00
Marliana Lara
f0481d0a60 Add inventory host list related groups column 2022-11-21 12:04:40 -05:00
Keith J. Grant
d34f6af830 fix traceback offset/counter # in UI 2022-11-15 13:35:14 -08:00
Joe Garcia
f9bb26ad33 Merge branch 'devel' into cyberark-ccp-branding-webserviceid 2022-11-10 20:50:02 -05:00
Joe Garcia
878035c13b Fixed webservice_id check to string 2022-10-26 12:45:59 -04:00
Joe Garcia
2cc971a43f default to AIMWebService if no val provided 2022-10-26 12:41:15 -04:00
Joe Garcia
9d77c54612 Remove references to AIM everywhere 2022-10-26 12:32:12 -04:00
Joe Garcia
ef651a3a21 Add Web Service ID & update branding 2022-10-26 11:54:09 -04:00
531 changed files with 6943 additions and 4773 deletions

View File

@@ -53,6 +53,16 @@ https://github.com/ansible/awx/#get-involved \
Thank you once again for this and your interest in AWX!
### Red Hat Support Team
- Hi! \
\
It appears that you are using an RPM build for RHEL. Please reach out to the Red Hat support team and submit a ticket. \
\
Here is the link to do so: \
\
https://access.redhat.com/support \
\
Thank you for your submission and for supporting AWX!
## Common
@@ -96,6 +106,13 @@ The Ansible Community is looking at building an EE that corresponds to all of th
### Oracle AWX
We'd be happy to help if you can reproduce this with AWX, since we do not have Oracle's Linux Automation Manager. If you need help with this specific version of Oracle's Linux Automation Manager, you will need to contact Oracle for support.
### Community Resolved
Hi,
We are happy to see that it appears a fix has been provided for your issue, so we will go ahead and close this ticket. Please feel free to reopen if any other problems arise.
<name of community member who helped> thanks so much for taking the time to write a thoughtful and helpful response to this issue!
### AWX Release
Subject: Announcing AWX Xa.Ya.za and AWX-Operator Xb.Yb.zb

View File

@@ -1,8 +1,10 @@
---
name: CI
env:
BRANCH: ${{ github.base_ref || 'devel' }}
LC_ALL: "C.UTF-8" # prevent ERROR: Ansible could not initialize the preferred locale: unsupported locale setting
CI_GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DEV_DOCKER_TAG_BASE: ghcr.io/${{ github.repository_owner }}
COMPOSE_TAG: ${{ github.base_ref || 'devel' }}
on:
pull_request:
jobs:
@@ -18,85 +20,33 @@ jobs:
tests:
- name: api-test
command: /start_tests.sh
label: Run API Tests
- name: api-lint
command: /var/lib/awx/venv/awx/bin/tox -e linters
label: Run API Linters
- name: api-swagger
command: /start_tests.sh swagger
label: Generate API Reference
- name: awx-collection
command: /start_tests.sh test_collection_all
label: Run Collection Tests
- name: api-schema
label: Check API Schema
command: /start_tests.sh detect-schema-change SCHEMA_DIFF_BASE_BRANCH=${{ github.event.pull_request.base.ref }}
- name: ui-lint
label: Run UI Linters
command: make ui-lint
- name: ui-test-screens
label: Run UI Screens Tests
command: make ui-test-screens
- name: ui-test-general
label: Run UI General Tests
command: make ui-test-general
steps:
- uses: actions/checkout@v2
- name: Get python version from Makefile
run: echo py_version=`make PYTHON_VERSION` >> $GITHUB_ENV
- name: Run check ${{ matrix.tests.name }}
run: AWX_DOCKER_CMD='${{ matrix.tests.command }}' make github_ci_runner
- name: Install python ${{ env.py_version }}
uses: actions/setup-python@v2
with:
python-version: ${{ env.py_version }}
- name: Log in to registry
run: |
echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin
- name: Pre-pull image to warm build cache
run: |
docker pull ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} || :
- name: Build image
run: |
DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${{ env.BRANCH }} make docker-compose-build
- name: ${{ matrix.texts.label }}
run: |
docker run -u $(id -u) --rm -v ${{ github.workspace}}:/awx_devel/:Z \
--workdir=/awx_devel ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} ${{ matrix.tests.command }}
dev-env:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Get python version from Makefile
run: echo py_version=`make PYTHON_VERSION` >> $GITHUB_ENV
- name: Install python ${{ env.py_version }}
uses: actions/setup-python@v2
with:
python-version: ${{ env.py_version }}
- name: Log in to registry
run: |
echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin
- name: Pre-pull image to warm build cache
run: |
docker pull ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} || :
- name: Build image
run: |
DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${{ env.BRANCH }} make docker-compose-build
- name: Run smoke test
run: |
export DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }}
export COMPOSE_TAG=${{ env.BRANCH }}
ansible-playbook tools/docker-compose/ansible/smoke-test.yml -e repo_dir=$(pwd) -v
run: make github_ci_setup && ansible-playbook tools/docker-compose/ansible/smoke-test.yml -v
awx-operator:
runs-on: ubuntu-latest
@@ -145,3 +95,22 @@ jobs:
env:
AWX_TEST_IMAGE: awx
AWX_TEST_VERSION: ci
collection-sanity:
name: awx_collection sanity
runs-on: ubuntu-latest
strategy:
fail-fast: false
steps:
- uses: actions/checkout@v2
# The containers that GitHub Actions use have Ansible installed, so upgrade to make sure we have the latest version.
- name: Upgrade ansible-core
run: python3 -m pip install --upgrade ansible-core
- name: Run sanity tests
run: make test_collection_sanity
env:
# needed due to cgroupsv2. This is fixed, but a stable release
# with the fix has not been made yet.
ANSIBLE_TEST_PREFER_PODMAN: 1

View File

@@ -6,7 +6,7 @@ env:
on:
pull_request_target:
types: [labeled]
jobs:
jobs:
e2e-test:
if: contains(github.event.pull_request.labels.*.name, 'qe:e2e')
runs-on: ubuntu-latest
@@ -107,5 +107,3 @@ jobs:
with:
name: AWX-logs-${{ matrix.job }}
path: make-docker-compose-output.log

View File

@@ -17,9 +17,9 @@ jobs:
env:
PR_BODY: ${{ github.event.pull_request.body }}
run: |
echo $PR_BODY | grep "Bug, Docs Fix or other nominal change" > Z
echo $PR_BODY | grep "New or Enhanced Feature" > Y
echo $PR_BODY | grep "Breaking Change" > X
echo "$PR_BODY" | grep "Bug, Docs Fix or other nominal change" > Z
echo "$PR_BODY" | grep "New or Enhanced Feature" > Y
echo "$PR_BODY" | grep "Breaking Change" > X
exit 0
# We exit 0 and set the shell to prevent the returns from the greps from failing this step
# See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#exit-codes-and-error-action-preference

View File

@@ -10,6 +10,7 @@ on:
jobs:
promote:
if: endsWith(github.repository, '/awx')
runs-on: ubuntu-latest
steps:
- name: Checkout awx
@@ -38,9 +39,13 @@ jobs:
- name: Build collection and publish to galaxy
run: |
COLLECTION_TEMPLATE_VERSION=true COLLECTION_NAMESPACE=${{ env.collection_namespace }} make build_collection
ansible-galaxy collection publish \
--token=${{ secrets.GALAXY_TOKEN }} \
awx_collection_build/${{ env.collection_namespace }}-awx-${{ github.event.release.tag_name }}.tar.gz
if [ "$(curl --head -sw '%{http_code}' https://galaxy.ansible.com/download/${{ env.collection_namespace }}-awx-${{ github.event.release.tag_name }}.tar.gz | tail -1)" == "302" ] ; then \
echo "Galaxy release already done"; \
else \
ansible-galaxy collection publish \
--token=${{ secrets.GALAXY_TOKEN }} \
awx_collection_build/${{ env.collection_namespace }}-awx-${{ github.event.release.tag_name }}.tar.gz; \
fi
- name: Set official pypi info
run: echo pypi_repo=pypi >> $GITHUB_ENV
@@ -52,6 +57,7 @@ jobs:
- name: Build awxkit and upload to pypi
run: |
git reset --hard
cd awxkit && python3 setup.py bdist_wheel
twine upload \
-r ${{ env.pypi_repo }} \
@@ -74,4 +80,6 @@ jobs:
docker tag ghcr.io/${{ github.repository }}:${{ github.event.release.tag_name }} quay.io/${{ github.repository }}:latest
docker push quay.io/${{ github.repository }}:${{ github.event.release.tag_name }}
docker push quay.io/${{ github.repository }}:latest
docker pull ghcr.io/${{ github.repository_owner }}/awx-ee:${{ github.event.release.tag_name }}
docker tag ghcr.io/${{ github.repository_owner }}/awx-ee:${{ github.event.release.tag_name }} quay.io/${{ github.repository_owner }}/awx-ee:${{ github.event.release.tag_name }}
docker push quay.io/${{ github.repository_owner }}/awx-ee:${{ github.event.release.tag_name }}

View File

@@ -21,6 +21,7 @@ on:
jobs:
stage:
if: endsWith(github.repository, '/awx')
runs-on: ubuntu-latest
permissions:
packages: write
@@ -84,6 +85,20 @@ jobs:
-e push=yes \
-e awx_official=yes
- name: Log in to GHCR
run: |
echo ${{ secrets.GITHUB_TOKEN }} | docker login ghcr.io -u ${{ github.actor }} --password-stdin
- name: Log in to Quay
run: |
echo ${{ secrets.QUAY_TOKEN }} | docker login quay.io -u ${{ secrets.QUAY_USER }} --password-stdin
- name: tag awx-ee:latest with version input
run: |
docker pull quay.io/ansible/awx-ee:latest
docker tag quay.io/ansible/awx-ee:latest ghcr.io/${{ github.repository_owner }}/awx-ee:${{ github.event.inputs.version }}
docker push ghcr.io/${{ github.repository_owner }}/awx-ee:${{ github.event.inputs.version }}
- name: Build and stage awx-operator
working-directory: awx-operator
run: |
@@ -103,6 +118,7 @@ jobs:
env:
AWX_TEST_IMAGE: ${{ github.repository }}
AWX_TEST_VERSION: ${{ github.event.inputs.version }}
AWX_EE_TEST_IMAGE: ghcr.io/${{ github.repository_owner }}/awx-ee:${{ github.event.inputs.version }}
- name: Create draft release for AWX
working-directory: awx

View File

@@ -1,4 +1,5 @@
PYTHON ?= python3.9
DOCKER_COMPOSE ?= docker-compose
OFFICIAL ?= no
NODE ?= node
NPM_BIN ?= npm
@@ -6,7 +7,20 @@ CHROMIUM_BIN=/tmp/chrome-linux/chrome
GIT_BRANCH ?= $(shell git rev-parse --abbrev-ref HEAD)
MANAGEMENT_COMMAND ?= awx-manage
VERSION := $(shell $(PYTHON) tools/scripts/scm_version.py)
COLLECTION_VERSION := $(shell $(PYTHON) tools/scripts/scm_version.py | cut -d . -f 1-3)
# ansible-test requires semver compatible version, so we allow overrides to hack it
COLLECTION_VERSION ?= $(shell $(PYTHON) tools/scripts/scm_version.py | cut -d . -f 1-3)
# args for the ansible-test sanity command
COLLECTION_SANITY_ARGS ?= --docker
# collection unit testing directories
COLLECTION_TEST_DIRS ?= awx_collection/test/awx
# collection integration test directories (defaults to all)
COLLECTION_TEST_TARGET ?=
# args for collection install
COLLECTION_PACKAGE ?= awx
COLLECTION_NAMESPACE ?= awx
COLLECTION_INSTALL = ~/.ansible/collections/ansible_collections/$(COLLECTION_NAMESPACE)/$(COLLECTION_PACKAGE)
COLLECTION_TEMPLATE_VERSION ?= false
# NOTE: This defaults the container image version to the branch that's active
COMPOSE_TAG ?= $(GIT_BRANCH)
@@ -52,7 +66,7 @@ I18N_FLAG_FILE = .i18n_built
sdist \
ui-release ui-devel \
VERSION PYTHON_VERSION docker-compose-sources \
.git/hooks/pre-commit
.git/hooks/pre-commit github_ci_setup github_ci_runner
clean-tmp:
rm -rf tmp/
@@ -190,19 +204,7 @@ uwsgi: collectstatic
@if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
uwsgi -b 32768 \
--socket 127.0.0.1:8050 \
--module=awx.wsgi:application \
--home=/var/lib/awx/venv/awx \
--chdir=/awx_devel/ \
--vacuum \
--processes=5 \
--harakiri=120 --master \
--no-orphans \
--max-requests=1000 \
--stats /tmp/stats.socket \
--lazy-apps \
--logformat "%(addr) %(method) %(uri) - %(proto) %(status)"
uwsgi /etc/tower/uwsgi.ini
awx-autoreload:
@/awx_devel/tools/docker-compose/awx-autoreload /awx_devel/awx "$(DEV_RELOAD_COMMAND)"
@@ -288,19 +290,28 @@ test:
cd awxkit && $(VENV_BASE)/awx/bin/tox -re py3
awx-manage check_migrations --dry-run --check -n 'missing_migration_file'
COLLECTION_TEST_DIRS ?= awx_collection/test/awx
COLLECTION_TEST_TARGET ?=
COLLECTION_PACKAGE ?= awx
COLLECTION_NAMESPACE ?= awx
COLLECTION_INSTALL = ~/.ansible/collections/ansible_collections/$(COLLECTION_NAMESPACE)/$(COLLECTION_PACKAGE)
COLLECTION_TEMPLATE_VERSION ?= false
## Login to Github container image registry, pull image, then build image.
github_ci_setup:
# GITHUB_ACTOR is automatic github actions env var
# CI_GITHUB_TOKEN is defined in .github files
echo $(CI_GITHUB_TOKEN) | docker login ghcr.io -u $(GITHUB_ACTOR) --password-stdin
docker pull $(DEVEL_IMAGE_NAME) || : # Pre-pull image to warm build cache
make docker-compose-build
## Runs AWX_DOCKER_CMD inside a new docker container.
docker-runner:
docker run -u $(shell id -u) --rm -v $(shell pwd):/awx_devel/:Z --workdir=/awx_devel $(DEVEL_IMAGE_NAME) $(AWX_DOCKER_CMD)
## Builds image and runs AWX_DOCKER_CMD in it, mainly for .github checks.
github_ci_runner: github_ci_setup docker-runner
test_collection:
rm -f $(shell ls -d $(VENV_BASE)/awx/lib/python* | head -n 1)/no-global-site-packages.txt
if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi && \
pip install ansible-core && \
if ! [ -x "$(shell command -v ansible-playbook)" ]; then pip install ansible-core; fi
ansible --version
py.test $(COLLECTION_TEST_DIRS) -v
# The python path needs to be modified so that the tests can find Ansible within the container
# First we will use anything explicitly set as PYTHONPATH
@@ -330,8 +341,13 @@ install_collection: build_collection
rm -rf $(COLLECTION_INSTALL)
ansible-galaxy collection install awx_collection_build/$(COLLECTION_NAMESPACE)-$(COLLECTION_PACKAGE)-$(COLLECTION_VERSION).tar.gz
test_collection_sanity: install_collection
cd $(COLLECTION_INSTALL) && ansible-test sanity
test_collection_sanity:
rm -rf awx_collection_build/
rm -rf $(COLLECTION_INSTALL)
if ! [ -x "$(shell command -v ansible-test)" ]; then pip install ansible-core; fi
ansible --version
COLLECTION_VERSION=1.0.0 make install_collection
cd $(COLLECTION_INSTALL) && ansible-test sanity $(COLLECTION_SANITY_ARGS)
test_collection_integration: install_collection
cd $(COLLECTION_INSTALL) && ansible-test integration $(COLLECTION_TEST_TARGET)
@@ -389,18 +405,18 @@ $(UI_BUILD_FLAG_FILE):
$(PYTHON) tools/scripts/compilemessages.py
$(NPM_BIN) --prefix awx/ui --loglevel warn run compile-strings
$(NPM_BIN) --prefix awx/ui --loglevel warn run build
mkdir -p /var/lib/awx/public/static/css
mkdir -p /var/lib/awx/public/static/js
mkdir -p /var/lib/awx/public/static/media
cp -r awx/ui/build/static/css/* /var/lib/awx/public/static/css
cp -r awx/ui/build/static/js/* /var/lib/awx/public/static/js
cp -r awx/ui/build/static/media/* /var/lib/awx/public/static/media
touch $@
ui-release: $(UI_BUILD_FLAG_FILE)
ui-devel: awx/ui/node_modules
@$(MAKE) -B $(UI_BUILD_FLAG_FILE)
mkdir -p /var/lib/awx/public/static/css
mkdir -p /var/lib/awx/public/static/js
mkdir -p /var/lib/awx/public/static/media
cp -r awx/ui/build/static/css/* /var/lib/awx/public/static/css
cp -r awx/ui/build/static/js/* /var/lib/awx/public/static/js
cp -r awx/ui/build/static/media/* /var/lib/awx/public/static/media
ui-devel-instrumented: awx/ui/node_modules
$(NPM_BIN) --prefix awx/ui --loglevel warn run start-instrumented
@@ -482,20 +498,20 @@ docker-compose-sources: .git/hooks/pre-commit
docker-compose: awx/projects docker-compose-sources
docker-compose -f tools/docker-compose/_sources/docker-compose.yml $(COMPOSE_OPTS) up $(COMPOSE_UP_OPTS) --remove-orphans
$(DOCKER_COMPOSE) -f tools/docker-compose/_sources/docker-compose.yml $(COMPOSE_OPTS) up $(COMPOSE_UP_OPTS) --remove-orphans
docker-compose-credential-plugins: awx/projects docker-compose-sources
echo -e "\033[0;31mTo generate a CyberArk Conjur API key: docker exec -it tools_conjur_1 conjurctl account create quick-start\033[0m"
docker-compose -f tools/docker-compose/_sources/docker-compose.yml -f tools/docker-credential-plugins-override.yml up --no-recreate awx_1 --remove-orphans
$(DOCKER_COMPOSE) -f tools/docker-compose/_sources/docker-compose.yml -f tools/docker-credential-plugins-override.yml up --no-recreate awx_1 --remove-orphans
docker-compose-test: awx/projects docker-compose-sources
docker-compose -f tools/docker-compose/_sources/docker-compose.yml run --rm --service-ports awx_1 /bin/bash
$(DOCKER_COMPOSE) -f tools/docker-compose/_sources/docker-compose.yml run --rm --service-ports awx_1 /bin/bash
docker-compose-runtest: awx/projects docker-compose-sources
docker-compose -f tools/docker-compose/_sources/docker-compose.yml run --rm --service-ports awx_1 /start_tests.sh
$(DOCKER_COMPOSE) -f tools/docker-compose/_sources/docker-compose.yml run --rm --service-ports awx_1 /start_tests.sh
docker-compose-build-swagger: awx/projects docker-compose-sources
docker-compose -f tools/docker-compose/_sources/docker-compose.yml run --rm --service-ports --no-deps awx_1 /start_tests.sh swagger
$(DOCKER_COMPOSE) -f tools/docker-compose/_sources/docker-compose.yml run --rm --service-ports --no-deps awx_1 /start_tests.sh swagger
SCHEMA_DIFF_BASE_BRANCH ?= devel
detect-schema-change: genschema
@@ -504,7 +520,7 @@ detect-schema-change: genschema
diff -u -b reference-schema.json schema.json
docker-compose-clean: awx/projects
docker-compose -f tools/docker-compose/_sources/docker-compose.yml rm -sf
$(DOCKER_COMPOSE) -f tools/docker-compose/_sources/docker-compose.yml rm -sf
docker-compose-container-group-clean:
@if [ -f "tools/docker-compose-minikube/_sources/minikube" ]; then \
@@ -532,10 +548,10 @@ docker-refresh: docker-clean docker-compose
## Docker Development Environment with Elastic Stack Connected
docker-compose-elk: awx/projects docker-compose-sources
docker-compose -f tools/docker-compose/_sources/docker-compose.yml -f tools/elastic/docker-compose.logstash-link.yml -f tools/elastic/docker-compose.elastic-override.yml up --no-recreate
$(DOCKER_COMPOSE) -f tools/docker-compose/_sources/docker-compose.yml -f tools/elastic/docker-compose.logstash-link.yml -f tools/elastic/docker-compose.elastic-override.yml up --no-recreate
docker-compose-cluster-elk: awx/projects docker-compose-sources
docker-compose -f tools/docker-compose/_sources/docker-compose.yml -f tools/elastic/docker-compose.logstash-link-cluster.yml -f tools/elastic/docker-compose.elastic-override.yml up --no-recreate
$(DOCKER_COMPOSE) -f tools/docker-compose/_sources/docker-compose.yml -f tools/elastic/docker-compose.logstash-link-cluster.yml -f tools/elastic/docker-compose.elastic-override.yml up --no-recreate
docker-compose-container-group:
MINIKUBE_CONTAINER_GROUP=true make docker-compose
@@ -598,7 +614,7 @@ messages:
@if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
$(PYTHON) manage.py makemessages -l $(LANG) --keep-pot
$(PYTHON) manage.py makemessages -l en_us --keep-pot
print-%:
@echo $($*)

View File

@@ -67,7 +67,6 @@ else:
from django.db import connection
if HAS_DJANGO is True:
# See upgrade blocker note in requirements/README.md
try:
names_digest('foo', 'bar', 'baz', length=8)

View File

@@ -1,5 +1,4 @@
# Django
from django.conf import settings
from django.utils.translation import gettext_lazy as _
# Django REST Framework
@@ -9,6 +8,7 @@ from rest_framework import serializers
from awx.conf import fields, register, register_validate
from awx.api.fields import OAuth2ProviderField
from oauth2_provider.settings import oauth2_settings
from awx.sso.common import is_remote_auth_enabled
register(
@@ -96,22 +96,20 @@ register(
category=_('Authentication'),
category_slug='authentication',
)
register(
'ALLOW_METRICS_FOR_ANONYMOUS_USERS',
field_class=fields.BooleanField,
default=False,
label=_('Allow anonymous users to poll metrics'),
help_text=_('If true, anonymous users are allowed to poll metrics.'),
category=_('Authentication'),
category_slug='authentication',
)
def authentication_validate(serializer, attrs):
remote_auth_settings = [
'AUTH_LDAP_SERVER_URI',
'SOCIAL_AUTH_GOOGLE_OAUTH2_KEY',
'SOCIAL_AUTH_GITHUB_KEY',
'SOCIAL_AUTH_GITHUB_ORG_KEY',
'SOCIAL_AUTH_GITHUB_TEAM_KEY',
'SOCIAL_AUTH_SAML_ENABLED_IDPS',
'RADIUS_SERVER',
'TACACSPLUS_HOST',
]
if attrs.get('DISABLE_LOCAL_AUTH', False):
if not any(getattr(settings, s, None) for s in remote_auth_settings):
raise serializers.ValidationError(_("There are no remote authentication systems configured."))
if attrs.get('DISABLE_LOCAL_AUTH', False) and not is_remote_auth_enabled():
raise serializers.ValidationError(_("There are no remote authentication systems configured."))
return attrs
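
The validator now delegates to is_remote_auth_enabled from awx.sso.common instead of walking its own settings list. The helper's body is outside this diff; a minimal sketch, assuming it centralizes the same checks (including any SOCIAL_AUTH_*_KEY provider setting):

from django.conf import settings

def is_remote_auth_enabled():
    candidates = [
        'AUTH_LDAP_SERVER_URI',
        'SOCIAL_AUTH_SAML_ENABLED_IDPS',
        'RADIUS_SERVER',
        'TACACSPLUS_HOST',
    ]
    # any configured social-auth provider key (Google, GitHub, ...) also counts
    candidates += [n for n in dir(settings) if n.startswith('SOCIAL_AUTH_') and n.endswith('_KEY')]
    return any(getattr(settings, n, None) for n in candidates)
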

View File

@@ -80,7 +80,6 @@ class VerbatimField(serializers.Field):
class OAuth2ProviderField(fields.DictField):
default_error_messages = {'invalid_key_names': _('Invalid key names: {invalid_key_names}')}
valid_key_names = {'ACCESS_TOKEN_EXPIRE_SECONDS', 'AUTHORIZATION_CODE_EXPIRE_SECONDS', 'REFRESH_TOKEN_EXPIRE_SECONDS'}
child = fields.IntegerField(min_value=1)

View File

@@ -155,12 +155,11 @@ class FieldLookupBackend(BaseFilterBackend):
'search',
)
# A list of fields that we know can be filtered on without the possiblity
# A list of fields that we know can be filtered on without the possibility
# of introducing duplicates
NO_DUPLICATES_ALLOW_LIST = (CharField, IntegerField, BooleanField, TextField)
def get_fields_from_lookup(self, model, lookup):
if '__' in lookup and lookup.rsplit('__', 1)[-1] in self.SUPPORTED_LOOKUPS:
path, suffix = lookup.rsplit('__', 1)
else:
@@ -269,7 +268,7 @@ class FieldLookupBackend(BaseFilterBackend):
continue
# HACK: make `created` available via API for the Django User ORM model
# so it keep compatiblity with other objects which exposes the `created` attr.
# so it keep compatibility with other objects which exposes the `created` attr.
if queryset.model._meta.object_name == 'User' and key.startswith('created'):
key = key.replace('created', 'date_joined')

View File

@@ -28,7 +28,7 @@ from rest_framework import generics
from rest_framework.response import Response
from rest_framework import status
from rest_framework import views
from rest_framework.permissions import AllowAny
from rest_framework.permissions import IsAuthenticated
from rest_framework.renderers import StaticHTMLRenderer
from rest_framework.negotiation import DefaultContentNegotiation
@@ -135,7 +135,6 @@ def get_default_schema():
class APIView(views.APIView):
schema = get_default_schema()
versioning_class = URLPathVersioning
@@ -675,7 +674,7 @@ class SubListCreateAttachDetachAPIView(SubListCreateAPIView):
location = None
created = True
# Retrive the sub object (whether created or by ID).
# Retrieve the sub object (whether created or by ID).
sub = get_object_or_400(self.model, pk=sub_id)
# Verify we have permission to attach.
@@ -800,7 +799,6 @@ class RetrieveUpdateDestroyAPIView(RetrieveUpdateAPIView, DestroyAPIView):
class ResourceAccessList(ParentMixin, ListAPIView):
serializer_class = ResourceAccessListElementSerializer
ordering = ('username',)
@@ -823,9 +821,8 @@ def trigger_delayed_deep_copy(*args, **kwargs):
class CopyAPIView(GenericAPIView):
serializer_class = CopySerializer
permission_classes = (AllowAny,)
permission_classes = (IsAuthenticated,)
copy_return_serializer_class = None
new_in_330 = True
new_in_api_v2 = True

View File

@@ -128,7 +128,7 @@ class Metadata(metadata.SimpleMetadata):
# Special handling of notification configuration where the required properties
# are conditional on the type selected.
if field.field_name == 'notification_configuration':
for (notification_type_name, notification_tr_name, notification_type_class) in NotificationTemplate.NOTIFICATION_TYPES:
for notification_type_name, notification_tr_name, notification_type_class in NotificationTemplate.NOTIFICATION_TYPES:
field_info[notification_type_name] = notification_type_class.init_parameters
# Special handling of notification messages where the required properties
@@ -138,7 +138,7 @@ class Metadata(metadata.SimpleMetadata):
except (AttributeError, KeyError):
view_model = None
if view_model == NotificationTemplate and field.field_name == 'messages':
for (notification_type_name, notification_tr_name, notification_type_class) in NotificationTemplate.NOTIFICATION_TYPES:
for notification_type_name, notification_tr_name, notification_type_class in NotificationTemplate.NOTIFICATION_TYPES:
field_info[notification_type_name] = notification_type_class.default_messages
# Update type of fields returned...

View File

@@ -24,7 +24,6 @@ class DisabledPaginator(DjangoPaginator):
class Pagination(pagination.PageNumberPagination):
page_size_query_param = 'page_size'
max_page_size = settings.MAX_PAGE_SIZE
count_disabled = False

View File

@@ -22,7 +22,6 @@ class SurrogateEncoder(encoders.JSONEncoder):
class DefaultJSONRenderer(renderers.JSONRenderer):
encoder_class = SurrogateEncoder
@@ -61,7 +60,7 @@ class BrowsableAPIRenderer(renderers.BrowsableAPIRenderer):
delattr(renderer_context['view'], '_request')
def get_raw_data_form(self, data, view, method, request):
# Set a flag on the view to indiciate to the view/serializer that we're
# Set a flag on the view to indicate to the view/serializer that we're
# creating a raw data form for the browsable API. Store the original
# request method to determine how to populate the raw data form.
if request.method in {'OPTIONS', 'DELETE'}:
@@ -95,7 +94,6 @@ class BrowsableAPIRenderer(renderers.BrowsableAPIRenderer):
class PlainTextRenderer(renderers.BaseRenderer):
media_type = 'text/plain'
format = 'txt'
@@ -106,18 +104,15 @@ class PlainTextRenderer(renderers.BaseRenderer):
class DownloadTextRenderer(PlainTextRenderer):
format = "txt_download"
class AnsiTextRenderer(PlainTextRenderer):
media_type = 'text/plain'
format = 'ansi'
class AnsiDownloadRenderer(PlainTextRenderer):
format = "ansi_download"

View File

@@ -108,7 +108,6 @@ from awx.main.utils import (
extract_ansible_vars,
encrypt_dict,
prefetch_page_capabilities,
get_external_account,
truncate_stdout,
)
from awx.main.utils.filters import SmartFilter
@@ -124,6 +123,8 @@ from awx.api.fields import BooleanNullField, CharNullField, ChoiceNullField, Ver
# AWX Utils
from awx.api.validators import HostnameRegexValidator
from awx.sso.common import get_external_account
logger = logging.getLogger('awx.api.serializers')
# Fields that should be summarized regardless of object type.
@@ -200,7 +201,6 @@ def reverse_gfk(content_object, request):
class CopySerializer(serializers.Serializer):
name = serializers.CharField()
def validate(self, attrs):
@@ -432,7 +432,6 @@ class BaseSerializer(serializers.ModelSerializer, metaclass=BaseSerializerMetacl
continue
summary_fields[fk] = OrderedDict()
for field in related_fields:
fval = getattr(fkval, field, None)
if fval is None and field == 'type':
@@ -538,7 +537,7 @@ class BaseSerializer(serializers.ModelSerializer, metaclass=BaseSerializerMetacl
#
# This logic is to force rendering choice's on an uneditable field.
# Note: Consider expanding this rendering for more than just choices fields
# Note: This logic works in conjuction with
# Note: This logic works in conjunction with
if hasattr(model_field, 'choices') and model_field.choices:
was_editable = model_field.editable
model_field.editable = True
@@ -930,7 +929,6 @@ class UnifiedJobListSerializer(UnifiedJobSerializer):
class UnifiedJobStdoutSerializer(UnifiedJobSerializer):
result_stdout = serializers.SerializerMethodField()
class Meta:
@@ -944,7 +942,6 @@ class UnifiedJobStdoutSerializer(UnifiedJobSerializer):
class UserSerializer(BaseSerializer):
password = serializers.CharField(required=False, default='', write_only=True, help_text=_('Write-only field used to change the password.'))
ldap_dn = serializers.CharField(source='profile.ldap_dn', read_only=True)
external_account = serializers.SerializerMethodField(help_text=_('Set if the account is managed by an external service'))
@@ -991,23 +988,8 @@ class UserSerializer(BaseSerializer):
def _update_password(self, obj, new_password):
# For now we're not raising an error, just not saving password for
# users managed by LDAP who already have an unusable password set.
if getattr(settings, 'AUTH_LDAP_SERVER_URI', None):
try:
if obj.pk and obj.profile.ldap_dn and not obj.has_usable_password():
new_password = None
except AttributeError:
pass
if (
getattr(settings, 'SOCIAL_AUTH_GOOGLE_OAUTH2_KEY', None)
or getattr(settings, 'SOCIAL_AUTH_GITHUB_KEY', None)
or getattr(settings, 'SOCIAL_AUTH_GITHUB_ORG_KEY', None)
or getattr(settings, 'SOCIAL_AUTH_GITHUB_TEAM_KEY', None)
or getattr(settings, 'SOCIAL_AUTH_SAML_ENABLED_IDPS', None)
) and obj.social_auth.all():
new_password = None
if (getattr(settings, 'RADIUS_SERVER', None) or getattr(settings, 'TACACSPLUS_HOST', None)) and obj.enterprise_auth.all():
new_password = None
if new_password:
# get_external_account returns something like 'ldap' or 'enterprise', or None if the user isn't external. Only allow a password update when it returns None.
if new_password and not self.get_external_account(obj):
obj.set_password(new_password)
obj.save(update_fields=['password'])
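
The old per-backend conditionals collapse into one question: is this account externally managed? get_external_account now lives in awx.sso.common; its body is outside this diff, but a plausible sketch that inspects only the user object rather than live settings:

def get_external_account(user):
    # Return a label for the external system managing the account,
    # or None for purely local accounts.
    if getattr(user, 'profile', None) and getattr(user.profile, 'ldap_dn', None):
        return 'ldap'
    if user.social_auth.exists():
        return 'social'
    if user.enterprise_auth.exists():
        return 'enterprise'
    return None
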
@@ -1104,7 +1086,6 @@ class UserActivityStreamSerializer(UserSerializer):
class BaseOAuth2TokenSerializer(BaseSerializer):
refresh_token = serializers.SerializerMethodField()
token = serializers.SerializerMethodField()
ALLOWED_SCOPES = ['read', 'write']
@@ -1222,7 +1203,6 @@ class UserPersonalTokenSerializer(BaseOAuth2TokenSerializer):
class OAuth2ApplicationSerializer(BaseSerializer):
show_capabilities = ['edit', 'delete']
class Meta:
@@ -1457,7 +1437,6 @@ class ExecutionEnvironmentSerializer(BaseSerializer):
class ProjectSerializer(UnifiedJobTemplateSerializer, ProjectOptionsSerializer):
status = serializers.ChoiceField(choices=Project.PROJECT_STATUS_CHOICES, read_only=True)
last_update_failed = serializers.BooleanField(read_only=True)
last_updated = serializers.DateTimeField(read_only=True)
@@ -1548,7 +1527,6 @@ class ProjectSerializer(UnifiedJobTemplateSerializer, ProjectOptionsSerializer):
class ProjectPlaybooksSerializer(ProjectSerializer):
playbooks = serializers.SerializerMethodField(help_text=_('Array of playbooks available within this project.'))
class Meta:
@@ -1566,7 +1544,6 @@ class ProjectPlaybooksSerializer(ProjectSerializer):
class ProjectInventoriesSerializer(ProjectSerializer):
inventory_files = serializers.ReadOnlyField(help_text=_('Array of inventory files and directories available within this project, ' 'not comprehensive.'))
class Meta:
@@ -1581,7 +1558,6 @@ class ProjectInventoriesSerializer(ProjectSerializer):
class ProjectUpdateViewSerializer(ProjectSerializer):
can_update = serializers.BooleanField(read_only=True)
class Meta:
@@ -1611,7 +1587,6 @@ class ProjectUpdateSerializer(UnifiedJobSerializer, ProjectOptionsSerializer):
class ProjectUpdateDetailSerializer(ProjectUpdateSerializer):
playbook_counts = serializers.SerializerMethodField(help_text=_('A count of all plays and tasks for the job run.'))
class Meta:
@@ -1634,7 +1609,6 @@ class ProjectUpdateListSerializer(ProjectUpdateSerializer, UnifiedJobListSeriali
class ProjectUpdateCancelSerializer(ProjectUpdateSerializer):
can_cancel = serializers.BooleanField(read_only=True)
class Meta:
@@ -1972,7 +1946,6 @@ class GroupSerializer(BaseSerializerWithVariables):
class GroupTreeSerializer(GroupSerializer):
children = serializers.SerializerMethodField()
class Meta:
@@ -2070,7 +2043,6 @@ class InventorySourceOptionsSerializer(BaseSerializer):
class InventorySourceSerializer(UnifiedJobTemplateSerializer, InventorySourceOptionsSerializer):
status = serializers.ChoiceField(choices=InventorySource.INVENTORY_SOURCE_STATUS_CHOICES, read_only=True)
last_update_failed = serializers.BooleanField(read_only=True)
last_updated = serializers.DateTimeField(read_only=True)
@@ -2215,7 +2187,6 @@ class InventorySourceSerializer(UnifiedJobTemplateSerializer, InventorySourceOpt
class InventorySourceUpdateSerializer(InventorySourceSerializer):
can_update = serializers.BooleanField(read_only=True)
class Meta:
@@ -2232,7 +2203,6 @@ class InventorySourceUpdateSerializer(InventorySourceSerializer):
class InventoryUpdateSerializer(UnifiedJobSerializer, InventorySourceOptionsSerializer):
custom_virtualenv = serializers.ReadOnlyField()
class Meta:
@@ -2273,7 +2243,6 @@ class InventoryUpdateSerializer(UnifiedJobSerializer, InventorySourceOptionsSeri
class InventoryUpdateDetailSerializer(InventoryUpdateSerializer):
source_project = serializers.SerializerMethodField(help_text=_('The project used for this job.'), method_name='get_source_project_id')
class Meta:
@@ -2324,7 +2293,6 @@ class InventoryUpdateListSerializer(InventoryUpdateSerializer, UnifiedJobListSer
class InventoryUpdateCancelSerializer(InventoryUpdateSerializer):
can_cancel = serializers.BooleanField(read_only=True)
class Meta:
@@ -2682,7 +2650,6 @@ class CredentialSerializer(BaseSerializer):
class CredentialSerializerCreate(CredentialSerializer):
user = serializers.PrimaryKeyRelatedField(
queryset=User.objects.all(),
required=False,
@@ -3037,7 +3004,6 @@ class JobTemplateWithSpecSerializer(JobTemplateSerializer):
class JobSerializer(UnifiedJobSerializer, JobOptionsSerializer):
passwords_needed_to_start = serializers.ReadOnlyField()
artifacts = serializers.SerializerMethodField()
@@ -3120,7 +3086,6 @@ class JobSerializer(UnifiedJobSerializer, JobOptionsSerializer):
class JobDetailSerializer(JobSerializer):
playbook_counts = serializers.SerializerMethodField(help_text=_('A count of all plays and tasks for the job run.'))
custom_virtualenv = serializers.ReadOnlyField()
@@ -3138,7 +3103,6 @@ class JobDetailSerializer(JobSerializer):
class JobCancelSerializer(BaseSerializer):
can_cancel = serializers.BooleanField(read_only=True)
class Meta:
@@ -3147,7 +3111,6 @@ class JobCancelSerializer(BaseSerializer):
class JobRelaunchSerializer(BaseSerializer):
passwords_needed_to_start = serializers.SerializerMethodField()
retry_counts = serializers.SerializerMethodField()
hosts = serializers.ChoiceField(
@@ -3207,7 +3170,6 @@ class JobRelaunchSerializer(BaseSerializer):
class JobCreateScheduleSerializer(LabelsListMixin, BaseSerializer):
can_schedule = serializers.SerializerMethodField()
prompts = serializers.SerializerMethodField()
@@ -3333,7 +3295,6 @@ class AdHocCommandDetailSerializer(AdHocCommandSerializer):
class AdHocCommandCancelSerializer(AdHocCommandSerializer):
can_cancel = serializers.BooleanField(read_only=True)
class Meta:
@@ -3372,7 +3333,6 @@ class SystemJobTemplateSerializer(UnifiedJobTemplateSerializer):
class SystemJobSerializer(UnifiedJobSerializer):
result_stdout = serializers.SerializerMethodField()
class Meta:
@@ -3399,7 +3359,6 @@ class SystemJobSerializer(UnifiedJobSerializer):
class SystemJobCancelSerializer(SystemJobSerializer):
can_cancel = serializers.BooleanField(read_only=True)
class Meta:
@@ -3564,7 +3523,6 @@ class WorkflowJobListSerializer(WorkflowJobSerializer, UnifiedJobListSerializer)
class WorkflowJobCancelSerializer(WorkflowJobSerializer):
can_cancel = serializers.BooleanField(read_only=True)
class Meta:
@@ -3578,7 +3536,6 @@ class WorkflowApprovalViewSerializer(UnifiedJobSerializer):
class WorkflowApprovalSerializer(UnifiedJobSerializer):
can_approve_or_deny = serializers.SerializerMethodField()
approval_expiration = serializers.SerializerMethodField()
timed_out = serializers.ReadOnlyField()
@@ -3973,7 +3930,6 @@ class JobHostSummarySerializer(BaseSerializer):
class JobEventSerializer(BaseSerializer):
event_display = serializers.CharField(source='get_event_display2', read_only=True)
event_level = serializers.IntegerField(read_only=True)
@@ -4027,7 +3983,7 @@ class JobEventSerializer(BaseSerializer):
# Show full stdout for playbook_on_* events.
if obj and obj.event.startswith('playbook_on'):
return data
# If the view logic says to not trunctate (request was to the detail view or a param was used)
# If the view logic says to not truncate (request was to the detail view or a param was used)
if self.context.get('no_truncate', False):
return data
max_bytes = settings.EVENT_STDOUT_MAX_BYTES_DISPLAY
@@ -4058,7 +4014,7 @@ class ProjectUpdateEventSerializer(JobEventSerializer):
# raw SCM URLs in their stdout (which *could* contain passwords)
# attempt to detect and filter HTTP basic auth passwords in the stdout
# of these types of events
if obj.event_data.get('task_action') in ('git', 'svn'):
if obj.event_data.get('task_action') in ('git', 'svn', 'ansible.builtin.git', 'ansible.builtin.svn'):
try:
return json.loads(UriCleaner.remove_sensitive(json.dumps(obj.event_data)))
except Exception:
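
The json.dumps/json.loads round-trip lets UriCleaner.remove_sensitive scrub credentials anywhere inside the nested event_data structure rather than only in top-level strings. Illustrative usage (assuming the awx.main.redact import path used elsewhere in the tree; the replacement token is AWX's $encrypted$ marker):

import json
from awx.main.redact import UriCleaner

event_data = {'task_action': 'git', 'cmd': 'git clone https://user:s3cret@example.com/repo.git'}
cleaned = json.loads(UriCleaner.remove_sensitive(json.dumps(event_data)))
# the password portion of the URL is replaced,
# e.g. 'git clone https://user:$encrypted$@example.com/repo.git'
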
@@ -4069,7 +4025,6 @@ class ProjectUpdateEventSerializer(JobEventSerializer):
class AdHocCommandEventSerializer(BaseSerializer):
event_display = serializers.CharField(source='get_event_display', read_only=True)
class Meta:
@@ -4103,7 +4058,7 @@ class AdHocCommandEventSerializer(BaseSerializer):
def to_representation(self, obj):
data = super(AdHocCommandEventSerializer, self).to_representation(obj)
# If the view logic says to not trunctate (request was to the detail view or a param was used)
# If the view logic says to not truncate (request was to the detail view or a param was used)
if self.context.get('no_truncate', False):
return data
max_bytes = settings.EVENT_STDOUT_MAX_BYTES_DISPLAY
@@ -4351,7 +4306,6 @@ class JobLaunchSerializer(BaseSerializer):
class WorkflowJobLaunchSerializer(BaseSerializer):
can_start_without_user_input = serializers.BooleanField(read_only=True)
defaults = serializers.SerializerMethodField()
variables_needed_to_start = serializers.ReadOnlyField()
@@ -4408,7 +4362,6 @@ class WorkflowJobLaunchSerializer(BaseSerializer):
return False
def get_defaults(self, obj):
defaults_dict = {}
for field_name in WorkflowJobTemplate.get_ask_mapping().keys():
if field_name == 'inventory':
@@ -4425,7 +4378,6 @@ class WorkflowJobLaunchSerializer(BaseSerializer):
return dict(name=obj.name, id=obj.id, description=obj.description)
def validate(self, attrs):
template = self.instance
accepted, rejected, errors = template._accept_or_ignore_job_kwargs(**attrs)
@@ -4666,7 +4618,6 @@ class NotificationTemplateSerializer(BaseSerializer):
class NotificationSerializer(BaseSerializer):
body = serializers.SerializerMethodField(help_text=_('Notification body'))
class Meta:
@@ -4800,7 +4751,7 @@ class ScheduleSerializer(LaunchConfigurationBaseSerializer, SchedulePreviewSeria
),
)
until = serializers.SerializerMethodField(
help_text=_('The date this schedule will end. This field is computed from the RRULE. If the schedule does not end an emptry string will be returned'),
help_text=_('The date this schedule will end. This field is computed from the RRULE. If the schedule does not end an empty string will be returned'),
)
class Meta:
@@ -5038,7 +4989,6 @@ class InstanceHealthCheckSerializer(BaseSerializer):
class InstanceGroupSerializer(BaseSerializer):
show_capabilities = ['edit', 'delete']
capacity = serializers.SerializerMethodField()
consumed_capacity = serializers.SerializerMethodField()
@@ -5225,7 +5175,6 @@ class InstanceGroupSerializer(BaseSerializer):
class ActivityStreamSerializer(BaseSerializer):
changes = serializers.SerializerMethodField()
object_association = serializers.SerializerMethodField(help_text=_("When present, shows the field name of the role or relationship that changed."))
object_type = serializers.SerializerMethodField(help_text=_("When present, shows the model on which the role or relationship was defined."))

View File

@@ -3,7 +3,7 @@ Make a GET request to this resource to retrieve aggregate statistics about inven
Including fetching the number of total hosts tracked by Tower over an amount of time and the current success or
failed status of hosts which have run jobs within an Inventory.
## Parmeters and Filtering
## Parameters and Filtering
The `period` of the data can be adjusted with:
@@ -24,7 +24,7 @@ Data about the number of hosts will be returned in the following format:
Each element contains an epoch timestamp represented in seconds and a numerical value indicating
the number of hosts that exist at a given moment
Data about failed and successfull hosts by inventory will be given as:
Data about failed and successful hosts by inventory will be given as:
{
"sources": [

View File

@@ -2,7 +2,7 @@
Make a GET request to this resource to retrieve aggregate statistics about job runs suitable for graphing.
## Parmeters and Filtering
## Parameters and Filtering
The `period` of the data can be adjusted with:

View File

@@ -18,7 +18,7 @@ inventory sources:
* `inventory_update`: ID of the inventory update job that was started.
(integer, read-only)
* `project_update`: ID of the project update job that was started if this inventory source is an SCM source.
(interger, read-only, optional)
(integer, read-only, optional)
Note: All manual inventory sources (source="") will be ignored by the update_inventory_sources endpoint. This endpoint will not update inventory sources for Smart Inventories.

View File

@@ -33,7 +33,6 @@ class HostnameRegexValidator(RegexValidator):
return f"regex={self.regex}, message={self.message}, code={self.code}, inverse_match={self.inverse_match}, flags={self.flags}"
def __validate(self, value):
if ' ' in value:
return False, ValidationError("whitespaces in hostnames are illegal")

File diff suppressed because it is too large

View File

@@ -25,6 +25,7 @@ from rest_framework import status
# Red Hat has an OID namespace (RHANANA). Receptor has its own designation under that.
RECEPTOR_OID = "1.3.6.1.4.1.2312.19.1"
# generate install bundle for the instance
# install bundle directory structure
# ├── install_receptor.yml (playbook)
@@ -40,7 +41,6 @@ RECEPTOR_OID = "1.3.6.1.4.1.2312.19.1"
# │ └── work-public-key.pem
# └── requirements.yml
class InstanceInstallBundle(GenericAPIView):
name = _('Install Bundle')
model = models.Instance
serializer_class = serializers.InstanceSerializer

View File

@@ -46,7 +46,6 @@ logger = logging.getLogger('awx.api.views.organization')
class InventoryUpdateEventsList(SubListAPIView):
model = InventoryUpdateEvent
serializer_class = InventoryUpdateEventSerializer
parent_model = InventoryUpdate
@@ -66,13 +65,11 @@ class InventoryUpdateEventsList(SubListAPIView):
class InventoryList(ListCreateAPIView):
model = Inventory
serializer_class = InventorySerializer
class InventoryDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAPIView):
model = Inventory
serializer_class = InventorySerializer
@@ -98,7 +95,6 @@ class InventoryDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAPIVie
class InventoryActivityStreamList(SubListAPIView):
model = ActivityStream
serializer_class = ActivityStreamSerializer
parent_model = Inventory
@@ -113,7 +109,6 @@ class InventoryActivityStreamList(SubListAPIView):
class InventoryInstanceGroupsList(SubListAttachDetachAPIView):
model = InstanceGroup
serializer_class = InstanceGroupSerializer
parent_model = Inventory
@@ -121,13 +116,11 @@ class InventoryInstanceGroupsList(SubListAttachDetachAPIView):
class InventoryAccessList(ResourceAccessList):
model = User # needs to be User for AccessLists's
parent_model = Inventory
class InventoryObjectRolesList(SubListAPIView):
model = Role
serializer_class = RoleSerializer
parent_model = Inventory
@@ -140,7 +133,6 @@ class InventoryObjectRolesList(SubListAPIView):
class InventoryJobTemplateList(SubListAPIView):
model = JobTemplate
serializer_class = JobTemplateSerializer
parent_model = Inventory
@@ -154,11 +146,9 @@ class InventoryJobTemplateList(SubListAPIView):
class InventoryLabelList(LabelSubListCreateAttachDetachView):
parent_model = Inventory
class InventoryCopy(CopyAPIView):
model = Inventory
copy_return_serializer_class = InventorySerializer

View File

@@ -59,13 +59,11 @@ class LabelSubListCreateAttachDetachView(SubListCreateAttachDetachAPIView):
class LabelDetail(RetrieveUpdateAPIView):
model = Label
serializer_class = LabelSerializer
class LabelList(ListCreateAPIView):
name = _("Labels")
model = Label
serializer_class = LabelSerializer

View File

@@ -10,13 +10,11 @@ from awx.main.models import InstanceLink, Instance
class MeshVisualizer(APIView):
name = _("Mesh Visualizer")
permission_classes = (IsSystemAdminOrAuditor,)
swagger_topic = "System Configuration"
def get(self, request, format=None):
data = {
'nodes': InstanceNodeSerializer(Instance.objects.all(), many=True).data,
'links': InstanceLinkSerializer(InstanceLink.objects.select_related('target', 'source'), many=True).data,

View File

@@ -5,9 +5,11 @@
import logging
# Django
from django.conf import settings
from django.utils.translation import gettext_lazy as _
# Django REST Framework
from rest_framework.permissions import AllowAny
from rest_framework.response import Response
from rest_framework.exceptions import PermissionDenied
@@ -25,15 +27,19 @@ logger = logging.getLogger('awx.analytics')
class MetricsView(APIView):
name = _('Metrics')
swagger_topic = 'Metrics'
renderer_classes = [renderers.PlainTextRenderer, renderers.PrometheusJSONRenderer, renderers.BrowsableAPIRenderer]
def initialize_request(self, request, *args, **kwargs):
if settings.ALLOW_METRICS_FOR_ANONYMOUS_USERS:
self.permission_classes = (AllowAny,)
return super(APIView, self).initialize_request(request, *args, **kwargs)
def get(self, request):
'''Show Metrics Details'''
if request.user.is_superuser or request.user.is_system_auditor:
if settings.ALLOW_METRICS_FOR_ANONYMOUS_USERS or request.user.is_superuser or request.user.is_system_auditor:
metrics_to_show = ''
if not request.query_params.get('subsystemonly', "0") == "1":
metrics_to_show += metrics().decode('UTF-8')
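
The trick in initialize_request: permission_classes is a class attribute fixed at import time, so the view re-derives it per request, letting a runtime settings change take effect without a restart. Condensed to its essentials (a sketch, not the full view):

from django.conf import settings
from rest_framework.permissions import AllowAny, IsAuthenticated
from rest_framework.views import APIView

class ConditionallyOpenView(APIView):
    permission_classes = (IsAuthenticated,)  # default: authenticated users only

    def initialize_request(self, request, *args, **kwargs):
        if settings.ALLOW_METRICS_FOR_ANONYMOUS_USERS:
            self.permission_classes = (AllowAny,)  # relaxed per request
        return super().initialize_request(request, *args, **kwargs)
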

View File

@@ -16,7 +16,7 @@ from rest_framework import status
from awx.main.constants import ACTIVE_STATES
from awx.main.utils import get_object_or_400
from awx.main.models.ha import Instance, InstanceGroup
from awx.main.models.ha import Instance, InstanceGroup, schedule_policy_task
from awx.main.models.organization import Team
from awx.main.models.projects import Project
from awx.main.models.inventory import Inventory
@@ -107,6 +107,11 @@ class InstanceGroupMembershipMixin(object):
if inst_name in ig_obj.policy_instance_list:
ig_obj.policy_instance_list.pop(ig_obj.policy_instance_list.index(inst_name))
ig_obj.save(update_fields=['policy_instance_list'])
# sometimes removing an instance has a non-obvious consequence
# this is almost always true if policy_instance_percentage or _minimum is non-zero
# after removing a single instance, the other memberships need to be re-balanced
schedule_policy_task()
return response
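
Why the extra dispatch matters: percentage and minimum policies are computed against the current instance set, so removing one member can change which instances should carry the group even when the target count stays the same. A self-contained toy of the target-count math (illustrative only; the real computation lives in the policy task):

import math

def target_count(total_instances, percentage):
    # e.g. policy_instance_percentage = 50
    return math.ceil(total_instances * percentage / 100)

assert target_count(4, 50) == 2
assert target_count(3, 50) == 2  # same count, but membership may need reshuffling
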

View File

@@ -58,7 +58,6 @@ logger = logging.getLogger('awx.api.views.organization')
class OrganizationList(OrganizationCountsMixin, ListCreateAPIView):
model = Organization
serializer_class = OrganizationSerializer
@@ -70,7 +69,6 @@ class OrganizationList(OrganizationCountsMixin, ListCreateAPIView):
class OrganizationDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAPIView):
model = Organization
serializer_class = OrganizationSerializer
@@ -106,7 +104,6 @@ class OrganizationDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAPI
class OrganizationInventoriesList(SubListAPIView):
model = Inventory
serializer_class = InventorySerializer
parent_model = Organization
@@ -114,7 +111,6 @@ class OrganizationInventoriesList(SubListAPIView):
class OrganizationUsersList(BaseUsersList):
model = User
serializer_class = UserSerializer
parent_model = Organization
@@ -123,7 +119,6 @@ class OrganizationUsersList(BaseUsersList):
class OrganizationAdminsList(BaseUsersList):
model = User
serializer_class = UserSerializer
parent_model = Organization
@@ -132,7 +127,6 @@ class OrganizationAdminsList(BaseUsersList):
class OrganizationProjectsList(SubListCreateAPIView):
model = Project
serializer_class = ProjectSerializer
parent_model = Organization
@@ -140,7 +134,6 @@ class OrganizationProjectsList(SubListCreateAPIView):
class OrganizationExecutionEnvironmentsList(SubListCreateAttachDetachAPIView):
model = ExecutionEnvironment
serializer_class = ExecutionEnvironmentSerializer
parent_model = Organization
@@ -150,7 +143,6 @@ class OrganizationExecutionEnvironmentsList(SubListCreateAttachDetachAPIView):
class OrganizationJobTemplatesList(SubListCreateAPIView):
model = JobTemplate
serializer_class = JobTemplateSerializer
parent_model = Organization
@@ -158,7 +150,6 @@ class OrganizationJobTemplatesList(SubListCreateAPIView):
class OrganizationWorkflowJobTemplatesList(SubListCreateAPIView):
model = WorkflowJobTemplate
serializer_class = WorkflowJobTemplateSerializer
parent_model = Organization
@@ -166,7 +157,6 @@ class OrganizationWorkflowJobTemplatesList(SubListCreateAPIView):
class OrganizationTeamsList(SubListCreateAttachDetachAPIView):
model = Team
serializer_class = TeamSerializer
parent_model = Organization
@@ -175,7 +165,6 @@ class OrganizationTeamsList(SubListCreateAttachDetachAPIView):
class OrganizationActivityStreamList(SubListAPIView):
model = ActivityStream
serializer_class = ActivityStreamSerializer
parent_model = Organization
@@ -184,7 +173,6 @@ class OrganizationActivityStreamList(SubListAPIView):
class OrganizationNotificationTemplatesList(SubListCreateAttachDetachAPIView):
model = NotificationTemplate
serializer_class = NotificationTemplateSerializer
parent_model = Organization
@@ -193,34 +181,28 @@ class OrganizationNotificationTemplatesList(SubListCreateAttachDetachAPIView):
class OrganizationNotificationTemplatesAnyList(SubListCreateAttachDetachAPIView):
model = NotificationTemplate
serializer_class = NotificationTemplateSerializer
parent_model = Organization
class OrganizationNotificationTemplatesStartedList(OrganizationNotificationTemplatesAnyList):
relationship = 'notification_templates_started'
class OrganizationNotificationTemplatesErrorList(OrganizationNotificationTemplatesAnyList):
relationship = 'notification_templates_error'
class OrganizationNotificationTemplatesSuccessList(OrganizationNotificationTemplatesAnyList):
relationship = 'notification_templates_success'
class OrganizationNotificationTemplatesApprovalList(OrganizationNotificationTemplatesAnyList):
relationship = 'notification_templates_approvals'
class OrganizationInstanceGroupsList(SubListAttachDetachAPIView):
model = InstanceGroup
serializer_class = InstanceGroupSerializer
parent_model = Organization
@@ -228,7 +210,6 @@ class OrganizationInstanceGroupsList(SubListAttachDetachAPIView):
class OrganizationGalaxyCredentialsList(SubListAttachDetachAPIView):
model = Credential
serializer_class = CredentialSerializer
parent_model = Organization
@@ -240,13 +221,11 @@ class OrganizationGalaxyCredentialsList(SubListAttachDetachAPIView):
class OrganizationAccessList(ResourceAccessList):
model = User # needs to be User for AccessLists's
parent_model = Organization
class OrganizationObjectRolesList(SubListAPIView):
model = Role
serializer_class = RoleSerializer
parent_model = Organization

View File

@@ -36,7 +36,6 @@ logger = logging.getLogger('awx.api.views.root')
class ApiRootView(APIView):
permission_classes = (AllowAny,)
name = _('REST API')
versioning_class = None
@@ -59,7 +58,6 @@ class ApiRootView(APIView):
class ApiOAuthAuthorizationRootView(APIView):
permission_classes = (AllowAny,)
name = _("API OAuth 2 Authorization Root")
versioning_class = None
@@ -74,7 +72,6 @@ class ApiOAuthAuthorizationRootView(APIView):
class ApiVersionRootView(APIView):
permission_classes = (AllowAny,)
swagger_topic = 'Versioning'
@@ -172,7 +169,6 @@ class ApiV2PingView(APIView):
class ApiV2SubscriptionView(APIView):
permission_classes = (IsAuthenticated,)
name = _('Subscriptions')
swagger_topic = 'System Configuration'
@@ -212,7 +208,6 @@ class ApiV2SubscriptionView(APIView):
class ApiV2AttachView(APIView):
permission_classes = (IsAuthenticated,)
name = _('Attach Subscription')
swagger_topic = 'System Configuration'
@@ -230,7 +225,6 @@ class ApiV2AttachView(APIView):
user = getattr(settings, 'SUBSCRIPTIONS_USERNAME', None)
pw = getattr(settings, 'SUBSCRIPTIONS_PASSWORD', None)
if pool_id and user and pw:
data = request.data.copy()
try:
with set_environ(**settings.AWX_TASK_ENV):
@@ -258,7 +252,6 @@ class ApiV2AttachView(APIView):
class ApiV2ConfigView(APIView):
permission_classes = (IsAuthenticated,)
name = _('Configuration')
swagger_topic = 'System Configuration'

View File

@@ -8,7 +8,6 @@ from django.utils.translation import gettext_lazy as _
class ConfConfig(AppConfig):
name = 'awx.conf'
verbose_name = _('Configuration')
@@ -16,7 +15,6 @@ class ConfConfig(AppConfig):
self.module.autodiscover()
if not set(sys.argv) & {'migrate', 'check_migrations'}:
from .settings import SettingsWrapper
SettingsWrapper.initialize()

View File

@@ -21,7 +21,7 @@ logger = logging.getLogger('awx.conf.fields')
# Use DRF fields to convert/validate settings:
# - to_representation(obj) should convert a native Python object to a primitive
# serializable type. This primitive type will be what is presented in the API
# and stored in the JSON field in the datbase.
# and stored in the JSON field in the database.
# - to_internal_value(data) should convert the primitive type back into the
# appropriate Python type to be used in settings.
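
A minimal field honoring that contract, as a sketch (not a field from this module): settings code sees a list while the API and the JSON column see a single string.

from rest_framework.fields import Field

class CommaSeparatedField(Field):
    # Hypothetical example following the contract described above.
    def to_representation(self, value):
        # native Python object (list) -> primitive serializable type (str)
        return ','.join(value)

    def to_internal_value(self, data):
        # primitive type (str) -> Python type used in settings (list)
        return [item.strip() for item in data.split(',') if item.strip()]
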
@@ -47,7 +47,6 @@ class IntegerField(IntegerField):
class StringListField(ListField):
child = CharField()
def to_representation(self, value):
@@ -57,7 +56,6 @@ class StringListField(ListField):
class StringListBooleanField(ListField):
default_error_messages = {'type_error': _('Expected None, True, False, a string or list of strings but got {input_type} instead.')}
child = CharField()
@@ -96,7 +94,6 @@ class StringListBooleanField(ListField):
class StringListPathField(StringListField):
default_error_messages = {'type_error': _('Expected list of strings but got {input_type} instead.'), 'path_error': _('{path} is not a valid path choice.')}
def to_internal_value(self, paths):
@@ -126,7 +123,6 @@ class StringListIsolatedPathField(StringListField):
}
def to_internal_value(self, paths):
if isinstance(paths, (list, tuple)):
for p in paths:
if not isinstance(p, str):

View File

@@ -8,7 +8,6 @@ import awx.main.fields
class Migration(migrations.Migration):
dependencies = [migrations.swappable_dependency(settings.AUTH_USER_MODEL)]
operations = [

View File

@@ -48,7 +48,6 @@ def revert_tower_settings(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [('conf', '0001_initial'), ('main', '0004_squashed_v310_release')]
run_before = [('main', '0005_squashed_v310_v313_updates')]

View File

@@ -7,7 +7,6 @@ import awx.main.fields
class Migration(migrations.Migration):
dependencies = [('conf', '0002_v310_copy_tower_settings')]
operations = [migrations.AlterField(model_name='setting', name='value', field=awx.main.fields.JSONBlob(null=True))]

View File

@@ -5,7 +5,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [('conf', '0003_v310_JSONField_changes')]
operations = [

View File

@@ -15,7 +15,6 @@ def reverse_copy_session_settings(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [('conf', '0004_v320_reencrypt')]
operations = [migrations.RunPython(copy_session_settings, reverse_copy_session_settings)]

View File

@@ -8,7 +8,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [('conf', '0005_v330_rename_two_session_settings')]
operations = [migrations.RunPython(fill_ldap_group_type_params)]

View File

@@ -9,7 +9,6 @@ def copy_allowed_ips(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [('conf', '0006_v331_ldap_group_type')]
operations = [migrations.RunPython(copy_allowed_ips)]

View File

@@ -14,7 +14,6 @@ def _noop(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [('conf', '0007_v380_rename_more_settings')]
operations = [migrations.RunPython(clear_old_license, _noop), migrations.RunPython(prefill_rh_credentials, _noop)]

View File

@@ -10,7 +10,6 @@ def rename_proot_settings(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [('conf', '0008_subscriptions')]
operations = [migrations.RunPython(rename_proot_settings)]

View File

@@ -1,7 +1,11 @@
import inspect
from django.conf import settings
from django.utils.timezone import now
import logging
logger = logging.getLogger('awx.conf.migrations')
def fill_ldap_group_type_params(apps, schema_editor):
@@ -15,7 +19,7 @@ def fill_ldap_group_type_params(apps, schema_editor):
entry = qs[0]
group_type_params = entry.value
else:
entry = Setting(key='AUTH_LDAP_GROUP_TYPE_PARAMS', value=group_type_params, created=now(), modified=now())
return # for new installs we prefer to use the default value
init_attrs = set(inspect.getfullargspec(group_type.__init__).args[1:])
for k in list(group_type_params.keys()):
@@ -23,4 +27,5 @@ def fill_ldap_group_type_params(apps, schema_editor):
del group_type_params[k]
entry.value = group_type_params
logger.warning(f'Migration updating AUTH_LDAP_GROUP_TYPE_PARAMS with value {entry.value}')
entry.save()
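
The pruning loop above keeps only keys that the group type's constructor still accepts, discovered by introspection. The core idea in isolation (stand-in class, not an actual LDAP group type):

import inspect

class ExampleGroupType:
    def __init__(self, member_attr='member', name_attr='cn'):
        self.member_attr = member_attr
        self.name_attr = name_attr

group_type_params = {'member_attr': 'member', 'name_attr': 'cn', 'obsolete': 'x'}
init_attrs = set(inspect.getfullargspec(ExampleGroupType.__init__).args[1:])
for k in list(group_type_params):
    if k not in init_attrs:
        del group_type_params[k]
assert group_type_params == {'member_attr': 'member', 'name_attr': 'cn'}
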

View File

@@ -10,7 +10,6 @@ __all__ = ['rename_setting']
def rename_setting(apps, schema_editor, old_key, new_key):
old_setting = None
Setting = apps.get_model('conf', 'Setting')
if Setting.objects.filter(key=new_key).exists() or hasattr(settings, new_key):

View File

@@ -17,7 +17,6 @@ __all__ = ['Setting']
class Setting(CreatedModifiedModel):
key = models.CharField(max_length=255)
value = JSONBlob(null=True)
user = prevent_search(models.ForeignKey('auth.User', related_name='settings', default=None, null=True, editable=False, on_delete=models.CASCADE))

View File

@@ -104,7 +104,6 @@ def filter_sensitive(registry, key, value):
class TransientSetting(object):
__slots__ = ('pk', 'value')
def __init__(self, pk, value):

View File

@@ -0,0 +1,25 @@
import pytest
from awx.conf.migrations._ldap_group_type import fill_ldap_group_type_params
from awx.conf.models import Setting
from django.apps import apps
@pytest.mark.django_db
def test_fill_group_type_params_no_op():
fill_ldap_group_type_params(apps, 'dont-use-me')
assert Setting.objects.count() == 0
@pytest.mark.django_db
def test_keep_old_setting_with_default_value():
Setting.objects.create(key='AUTH_LDAP_GROUP_TYPE', value={'name_attr': 'cn', 'member_attr': 'member'})
fill_ldap_group_type_params(apps, 'dont-use-me')
assert Setting.objects.count() == 1
s = Setting.objects.first()
assert s.value == {'name_attr': 'cn', 'member_attr': 'member'}
# NOTE: would be good to test the removal of attributes by migration
# but this requires fighting with the validator and is not done here

View File

@@ -5,7 +5,6 @@ from awx.conf.fields import StringListBooleanField, StringListPathField, ListTup
class TestStringListBooleanField:
FIELD_VALUES = [
("hello", "hello"),
(("a", "b"), ["a", "b"]),
@@ -53,7 +52,6 @@ class TestStringListBooleanField:
class TestListTuplesField:
FIELD_VALUES = [([('a', 'b'), ('abc', '123')], [("a", "b"), ("abc", "123")])]
FIELD_VALUES_INVALID = [("abc", type("abc")), ([('a', 'b', 'c'), ('abc', '123', '456')], type(('a',))), (['a', 'b'], type('a')), (123, type(123))]
@@ -73,7 +71,6 @@ class TestListTuplesField:
class TestStringListPathField:
FIELD_VALUES = [
((".", "..", "/"), [".", "..", "/"]),
(("/home",), ["/home"]),

View File

@@ -36,7 +36,6 @@ SettingCategory = collections.namedtuple('SettingCategory', ('url', 'slug', 'nam
class SettingCategoryList(ListAPIView):
model = Setting # Not exactly, but needed for the view.
serializer_class = SettingCategorySerializer
filter_backends = []
@@ -58,7 +57,6 @@ class SettingCategoryList(ListAPIView):
class SettingSingletonDetail(RetrieveUpdateDestroyAPIView):
model = Setting # Not exactly, but needed for the view.
serializer_class = SettingSingletonSerializer
filter_backends = []
@@ -146,7 +144,6 @@ class SettingSingletonDetail(RetrieveUpdateDestroyAPIView):
class SettingLoggingTest(GenericAPIView):
name = _('Logging Connectivity Test')
model = Setting
serializer_class = SettingSingletonSerializer
@@ -183,7 +180,7 @@ class SettingLoggingTest(GenericAPIView):
if not port:
return Response({'error': 'Port required for ' + protocol}, status=status.HTTP_400_BAD_REQUEST)
else:
# if http/https by this point, domain is reacheable
# if http/https by this point, domain is reachable
return Response(status=status.HTTP_202_ACCEPTED)
if protocol == 'udp':

View File

@@ -1972,7 +1972,7 @@ msgid ""
"HTTP headers and meta keys to search to determine remote host name or IP. "
"Add additional items to this list, such as \"HTTP_X_FORWARDED_FOR\", if "
"behind a reverse proxy. See the \"Proxy Support\" section of the "
"Adminstrator guide for more details."
"Administrator guide for more details."
msgstr ""
#: awx/main/conf.py:85
@@ -2457,7 +2457,7 @@ msgid ""
msgstr ""
#: awx/main/conf.py:631
msgid "Maximum disk persistance for external log aggregation (in GB)"
msgid "Maximum disk persistence for external log aggregation (in GB)"
msgstr ""
#: awx/main/conf.py:633
@@ -2548,7 +2548,7 @@ msgid "Enable"
msgstr ""
#: awx/main/constants.py:27
msgid "Doas"
msgid "Does"
msgstr ""
#: awx/main/constants.py:28
@@ -4801,7 +4801,7 @@ msgstr ""
#: awx/main/models/workflow.py:251
msgid ""
"An identifier coresponding to the workflow job template node that this node "
"An identifier corresponding to the workflow job template node that this node "
"was created from."
msgstr ""
@@ -5521,7 +5521,7 @@ msgstr ""
#: awx/sso/conf.py:606
msgid ""
"Extra arguments for Google OAuth2 login. You can restrict it to only allow a "
"single domain to authenticate, even if the user is logged in with multple "
"single domain to authenticate, even if the user is logged in with multiple "
"Google accounts. Refer to the documentation for more detail."
msgstr ""
@@ -5905,7 +5905,7 @@ msgstr ""
#: awx/sso/conf.py:1290
msgid ""
"Create a keypair to use as a service provider (SP) and include the "
"Create a key pair to use as a service provider (SP) and include the "
"certificate content here."
msgstr ""
@@ -5915,7 +5915,7 @@ msgstr ""
#: awx/sso/conf.py:1302
msgid ""
"Create a keypair to use as a service provider (SP) and include the private "
"Create a key pair to use as a service provider (SP) and include the private "
"key content here."
msgstr ""

View File

@@ -1971,7 +1971,7 @@ msgid ""
"HTTP headers and meta keys to search to determine remote host name or IP. "
"Add additional items to this list, such as \"HTTP_X_FORWARDED_FOR\", if "
"behind a reverse proxy. See the \"Proxy Support\" section of the "
"Adminstrator guide for more details."
"Administrator guide for more details."
msgstr "Los encabezados HTTP y las llaves de activación para buscar y determinar el nombre de host remoto o IP. Añada elementos adicionales a esta lista, como \"HTTP_X_FORWARDED_FOR\", si está detrás de un proxy inverso. Consulte la sección \"Soporte de proxy\" de la guía del adminstrador para obtener más información."
#: awx/main/conf.py:85
@@ -4804,7 +4804,7 @@ msgstr "Indica que un trabajo no se creará cuando es sea True. La semántica de
#: awx/main/models/workflow.py:251
msgid ""
"An identifier coresponding to the workflow job template node that this node "
"An identifier corresponding to the workflow job template node that this node "
"was created from."
msgstr "Un identificador que corresponde al nodo de plantilla de tarea del flujo de trabajo a partir del cual se creó este nodo."
@@ -5526,7 +5526,7 @@ msgstr "Argumentos adicionales para Google OAuth2"
#: awx/sso/conf.py:606
msgid ""
"Extra arguments for Google OAuth2 login. You can restrict it to only allow a "
"single domain to authenticate, even if the user is logged in with multple "
"single domain to authenticate, even if the user is logged in with multiple "
"Google accounts. Refer to the documentation for more detail."
msgstr "Argumentos adicionales para el inicio de sesión en Google OAuth2. Puede limitarlo para permitir la autenticación de un solo dominio, incluso si el usuario ha iniciado sesión con varias cuentas de Google. Consulte la documentación para obtener información detallada."
@@ -5910,7 +5910,7 @@ msgstr "Certificado público del proveedor de servicio SAML"
#: awx/sso/conf.py:1290
msgid ""
"Create a keypair to use as a service provider (SP) and include the "
"Create a key pair to use as a service provider (SP) and include the "
"certificate content here."
msgstr "Crear un par de claves para usar como proveedor de servicio (SP) e incluir el contenido del certificado aquí."
@@ -5920,7 +5920,7 @@ msgstr "Clave privada del proveedor de servicio SAML"
#: awx/sso/conf.py:1302
msgid ""
"Create a keypair to use as a service provider (SP) and include the private "
"Create a key pair to use as a service provider (SP) and include the private "
"key content here."
msgstr "Crear un par de claves para usar como proveedor de servicio (SP) e incluir el contenido de la clave privada aquí."

View File

@@ -561,7 +561,6 @@ class NotificationAttachMixin(BaseAccess):
class InstanceAccess(BaseAccess):
model = Instance
prefetch_related = ('rampart_groups',)
@@ -579,7 +578,6 @@ class InstanceAccess(BaseAccess):
return super(InstanceAccess, self).can_unattach(obj, sub_obj, relationship, relationship, data=data)
def can_add(self, data):
return self.user.is_superuser
def can_change(self, obj, data):
@@ -590,7 +588,6 @@ class InstanceAccess(BaseAccess):
class InstanceGroupAccess(BaseAccess):
model = InstanceGroup
prefetch_related = ('instances',)
@@ -1030,7 +1027,9 @@ class GroupAccess(BaseAccess):
return Group.objects.filter(inventory__in=Inventory.accessible_pk_qs(self.user, 'read_role'))
def can_add(self, data):
if not data or 'inventory' not in data:
if not data: # So the browseable API will work
return Inventory.accessible_objects(self.user, 'admin_role').exists()
if 'inventory' not in data:
return False
# Checks for admin or change permission on inventory.
return self.check_related('inventory', Inventory, data)
@@ -2352,7 +2351,6 @@ class JobEventAccess(BaseAccess):
class UnpartitionedJobEventAccess(JobEventAccess):
model = UnpartitionedJobEvent
@@ -2697,46 +2695,66 @@ class ActivityStreamAccess(BaseAccess):
# 'job_template', 'job', 'project', 'project_update', 'workflow_job',
# 'inventory_source', 'workflow_job_template'
inventory_set = Inventory.accessible_objects(self.user, 'read_role')
credential_set = Credential.accessible_objects(self.user, 'read_role')
q = Q(user=self.user)
inventory_set = Inventory.accessible_pk_qs(self.user, 'read_role')
if inventory_set:
q |= (
Q(ad_hoc_command__inventory__in=inventory_set)
| Q(inventory__in=inventory_set)
| Q(host__inventory__in=inventory_set)
| Q(group__inventory__in=inventory_set)
| Q(inventory_source__inventory__in=inventory_set)
| Q(inventory_update__inventory_source__inventory__in=inventory_set)
)
credential_set = Credential.accessible_pk_qs(self.user, 'read_role')
if credential_set:
q |= Q(credential__in=credential_set)
auditing_orgs = (
(Organization.accessible_objects(self.user, 'admin_role') | Organization.accessible_objects(self.user, 'auditor_role'))
.distinct()
.values_list('id', flat=True)
)
project_set = Project.accessible_objects(self.user, 'read_role')
jt_set = JobTemplate.accessible_objects(self.user, 'read_role')
team_set = Team.accessible_objects(self.user, 'read_role')
wfjt_set = WorkflowJobTemplate.accessible_objects(self.user, 'read_role')
app_set = OAuth2ApplicationAccess(self.user).filtered_queryset()
token_set = OAuth2TokenAccess(self.user).filtered_queryset()
if auditing_orgs:
q |= (
Q(user__in=auditing_orgs.values('member_role__members'))
| Q(organization__in=auditing_orgs)
| Q(notification_template__organization__in=auditing_orgs)
| Q(notification__notification_template__organization__in=auditing_orgs)
| Q(label__organization__in=auditing_orgs)
| Q(role__in=Role.objects.filter(ancestors__in=self.user.roles.all()) if auditing_orgs else [])
)
return qs.filter(
Q(ad_hoc_command__inventory__in=inventory_set)
| Q(o_auth2_application__in=app_set)
| Q(o_auth2_access_token__in=token_set)
| Q(user__in=auditing_orgs.values('member_role__members'))
| Q(user=self.user)
| Q(organization__in=auditing_orgs)
| Q(inventory__in=inventory_set)
| Q(host__inventory__in=inventory_set)
| Q(group__inventory__in=inventory_set)
| Q(inventory_source__inventory__in=inventory_set)
| Q(inventory_update__inventory_source__inventory__in=inventory_set)
| Q(credential__in=credential_set)
| Q(team__in=team_set)
| Q(project__in=project_set)
| Q(project_update__project__in=project_set)
| Q(job_template__in=jt_set)
| Q(job__job_template__in=jt_set)
| Q(workflow_job_template__in=wfjt_set)
| Q(workflow_job_template_node__workflow_job_template__in=wfjt_set)
| Q(workflow_job__workflow_job_template__in=wfjt_set)
| Q(notification_template__organization__in=auditing_orgs)
| Q(notification__notification_template__organization__in=auditing_orgs)
| Q(label__organization__in=auditing_orgs)
| Q(role__in=Role.objects.filter(ancestors__in=self.user.roles.all()) if auditing_orgs else [])
).distinct()
project_set = Project.accessible_pk_qs(self.user, 'read_role')
if project_set:
q |= Q(project__in=project_set) | Q(project_update__project__in=project_set)
jt_set = JobTemplate.accessible_pk_qs(self.user, 'read_role')
if jt_set:
q |= Q(job_template__in=jt_set) | Q(job__job_template__in=jt_set)
wfjt_set = WorkflowJobTemplate.accessible_pk_qs(self.user, 'read_role')
if wfjt_set:
q |= (
Q(workflow_job_template__in=wfjt_set)
| Q(workflow_job_template_node__workflow_job_template__in=wfjt_set)
| Q(workflow_job__workflow_job_template__in=wfjt_set)
)
team_set = Team.accessible_pk_qs(self.user, 'read_role')
if team_set:
q |= Q(team__in=team_set)
app_set = OAuth2ApplicationAccess(self.user).filtered_queryset()
if app_set:
q |= Q(o_auth2_application__in=app_set)
token_set = OAuth2TokenAccess(self.user).filtered_queryset()
if token_set:
q |= Q(o_auth2_access_token__in=token_set)
return qs.filter(q).distinct()
def can_add(self, data):
return False
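
The rewrite replaces one monolithic OR across every related object type with a Q object grown incrementally, and each clause is added only when its pk subquery is non-empty, so the generated SQL carries joins only for object types the user can actually see. A reduced sketch of the pattern (hypothetical standalone function; the models and accessible_pk_qs come from awx.main.models):

from django.db.models import Q
from awx.main.models import Inventory, Team

def visible_activity(qs, user):
    q = Q(user=user)  # a user can always see their own actions
    inventory_pks = Inventory.accessible_pk_qs(user, 'read_role')
    if inventory_pks:  # bool() evaluates the pk queryset; an empty set adds no clause
        q |= Q(inventory__in=inventory_pks)
    team_pks = Team.accessible_pk_qs(user, 'read_role')
    if team_pks:
        q |= Q(team__in=team_pks)
    return qs.filter(q).distinct()
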

View File

@@ -2,6 +2,7 @@ import datetime
import asyncio
import logging
import redis
import redis.asyncio
import re
from prometheus_client import (
@@ -81,7 +82,7 @@ class BroadcastWebsocketStatsManager:
async def run_loop(self):
try:
redis_conn = await redis.asyncio.create_redis_pool(settings.BROKER_URL)
redis_conn = await redis.asyncio.Redis.from_url(settings.BROKER_URL)
while True:
stats_data_str = ''.join(stat.serialize() for stat in self._stats.values())
await redis_conn.set(self._redis_key, stats_data_str)
@@ -121,8 +122,8 @@ class BroadcastWebsocketStats:
'Number of messages received, to be forwarded, by the broadcast websocket system',
registry=self._registry,
)
self._messages_received = Gauge(
f'awx_{self.remote_name}_messages_received',
self._messages_received_current_conn = Gauge(
f'awx_{self.remote_name}_messages_received_current_conn',
'Number forwarded messages received by the broadcast websocket system, for the duration of the current connection',
registry=self._registry,
)
@@ -143,13 +144,13 @@ class BroadcastWebsocketStats:
def record_message_received(self):
self._internal_messages_received_per_minute.record()
self._messages_received.inc()
self._messages_received_current_conn.inc()
self._messages_received_total.inc()
def record_connection_established(self):
self._connection.state('connected')
self._connection_start.set_to_current_time()
self._messages_received.set(0)
self._messages_received_current_conn.set(0)
def record_connection_lost(self):
self._connection.state('disconnected')

View File

@@ -3,6 +3,5 @@ from django.utils.translation import gettext_lazy as _
class MainConfig(AppConfig):
name = 'awx.main'
verbose_name = _('Main')

View File

@@ -569,7 +569,7 @@ register(
register(
'LOG_AGGREGATOR_LOGGERS',
field_class=fields.StringListField,
default=['awx', 'activity_stream', 'job_events', 'system_tracking'],
default=['awx', 'activity_stream', 'job_events', 'system_tracking', 'broadcast_websocket'],
label=_('Loggers Sending Data to Log Aggregator Form'),
help_text=_(
'List of loggers that will send HTTP logs to the collector, these can '
@@ -577,7 +577,8 @@ register(
'awx - service logs\n'
'activity_stream - activity stream records\n'
'job_events - callback data from Ansible job events\n'
'system_tracking - facts gathered from scan jobs.'
'system_tracking - facts gathered from scan jobs\n'
'broadcast_websocket - errors pertaining to websockets broadcast metrics\n'
),
category=_('Logging'),
category_slug='logging',

View File
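The setting change above adds broadcast_websocket to the default allow-list of loggers shipped to the external aggregator. As an illustration only — this is not AWX's actual handler — a list setting like this is typically consumed by gating on the record's top-level logger name:

import logging

LOG_AGGREGATOR_LOGGERS = ['awx', 'activity_stream', 'job_events', 'system_tracking', 'broadcast_websocket']

class AllowListFilter(logging.Filter):
    def filter(self, record):
        # 'broadcast_websocket.stats' gates on 'broadcast_websocket'
        return record.name.split('.')[0] in LOG_AGGREGATOR_LOGGERS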

@@ -9,10 +9,16 @@ aim_inputs = {
'fields': [
{
'id': 'url',
'label': _('CyberArk AIM URL'),
'label': _('CyberArk CCP URL'),
'type': 'string',
'format': 'url',
},
{
'id': 'webservice_id',
'label': _('Web Service ID'),
'type': 'string',
'help_text': _('The CCP Web Service ID. Leave blank to default to AIMWebService.'),
},
{
'id': 'app_id',
'label': _('Application ID'),
@@ -64,10 +70,13 @@ def aim_backend(**kwargs):
client_cert = kwargs.get('client_cert', None)
client_key = kwargs.get('client_key', None)
verify = kwargs['verify']
webservice_id = kwargs.get('webservice_id', '')
app_id = kwargs['app_id']
object_query = kwargs['object_query']
object_query_format = kwargs['object_query_format']
reason = kwargs.get('reason', None)
if webservice_id == '':
webservice_id = 'AIMWebService'
query_params = {
'AppId': app_id,
@@ -78,7 +87,7 @@ def aim_backend(**kwargs):
query_params['reason'] = reason
request_qs = '?' + urlencode(query_params, quote_via=quote)
request_url = urljoin(url, '/'.join(['AIMWebService', 'api', 'Accounts']))
request_url = urljoin(url, '/'.join([webservice_id, 'api', 'Accounts']))
with CertFiles(client_cert, client_key) as cert:
res = requests.get(
@@ -92,4 +101,4 @@ def aim_backend(**kwargs):
return res.json()['Content']
aim_plugin = CredentialPlugin('CyberArk AIM Central Credential Provider Lookup', inputs=aim_inputs, backend=aim_backend)
aim_plugin = CredentialPlugin('CyberArk Central Credential Provider Lookup', inputs=aim_inputs, backend=aim_backend)

View File
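The hunk above makes the first path segment of the CCP request URL configurable, preserving AIMWebService as the default when the new webservice_id field is blank. A sketch of just the URL construction; the host, AppId, and query values are placeholders, and the query string is abbreviated to two of the plugin's parameters:

from urllib.parse import urljoin, urlencode, quote

def build_ccp_url(url, webservice_id='', app_id='my-app', object_query='Safe=Test;Object=db'):
    if webservice_id == '':
        webservice_id = 'AIMWebService'  # preserve the historical default
    query_params = {'AppId': app_id, 'Query': object_query}
    request_qs = '?' + urlencode(query_params, quote_via=quote)
    return urljoin(url, '/'.join([webservice_id, 'api', 'Accounts'])) + request_qs

# build_ccp_url('https://ccp.example.com', 'MyService')
#   -> 'https://ccp.example.com/MyService/api/Accounts?AppId=my-app&Query=...'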

@@ -68,7 +68,11 @@ def conjur_backend(**kwargs):
with CertFiles(cacert) as cert:
# https://www.conjur.org/api.html#authentication-authenticate-post
auth_kwargs['verify'] = cert
resp = requests.post(urljoin(url, '/'.join(['api', 'authn', account, username, 'authenticate'])), **auth_kwargs)
try:
resp = requests.post(urljoin(url, '/'.join(['authn', account, username, 'authenticate'])), **auth_kwargs)
resp.raise_for_status()
except requests.exceptions.HTTPError:
resp = requests.post(urljoin(url, '/'.join(['api', 'authn', account, username, 'authenticate'])), **auth_kwargs)
raise_for_status(resp)
token = resp.content.decode('utf-8')
@@ -78,14 +82,20 @@ def conjur_backend(**kwargs):
}
# https://www.conjur.org/api.html#secrets-retrieve-a-secret-get
path = urljoin(url, '/'.join(['api', 'secrets', account, 'variable', secret_path]))
path = urljoin(url, '/'.join(['secrets', account, 'variable', secret_path]))
path_conjurcloud = urljoin(url, '/'.join(['api', 'secrets', account, 'variable', secret_path]))
if version:
ver = "version={}".format(version)
path = '?'.join([path, ver])
path_conjurcloud = '?'.join([path_conjurcloud, ver])
with CertFiles(cacert) as cert:
lookup_kwargs['verify'] = cert
resp = requests.get(path, timeout=30, **lookup_kwargs)
try:
resp = requests.get(path, timeout=30, **lookup_kwargs)
resp.raise_for_status()
except requests.exceptions.HTTPError:
resp = requests.get(path_conjurcloud, timeout=30, **lookup_kwargs)
raise_for_status(resp)
return resp.text

View File
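Both Conjur hunks above follow the same shape: try the self-hosted path first (no api prefix) and, on any HTTP error, retry against the api-prefixed path used by Conjur Cloud. The fallback in isolation, as a hedged sketch with requests' own raise_for_status standing in for the plugin's helper:

from urllib.parse import urljoin
import requests

def get_with_cloud_fallback(url, segments, **kwargs):
    resp = requests.get(urljoin(url, '/'.join(segments)), timeout=30, **kwargs)
    try:
        resp.raise_for_status()
    except requests.exceptions.HTTPError:
        # retry with the 'api' prefix used by Conjur Cloud
        resp = requests.get(urljoin(url, '/'.join(['api'] + segments)), timeout=30, **kwargs)
        resp.raise_for_status()
    return resp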

@@ -1,6 +1,7 @@
import copy
import os
import pathlib
import time
from urllib.parse import urljoin
from .plugin import CredentialPlugin, CertFiles, raise_for_status
@@ -247,7 +248,15 @@ def kv_backend(**kwargs):
request_url = urljoin(url, '/'.join(['v1'] + path_segments)).rstrip('/')
with CertFiles(cacert) as cert:
request_kwargs['verify'] = cert
response = sess.get(request_url, **request_kwargs)
request_retries = 0
while request_retries < 5:
response = sess.get(request_url, **request_kwargs)
# https://developer.hashicorp.com/vault/docs/enterprise/consistency
if response.status_code == 412:
request_retries += 1
time.sleep(1)
else:
break
raise_for_status(response)
json = response.json()
@@ -289,8 +298,15 @@ def ssh_backend(**kwargs):
with CertFiles(cacert) as cert:
request_kwargs['verify'] = cert
resp = sess.post(request_url, **request_kwargs)
request_retries = 0
while request_retries < 5:
resp = sess.post(request_url, **request_kwargs)
# https://developer.hashicorp.com/vault/docs/enterprise/consistency
if resp.status_code == 412:
request_retries += 1
time.sleep(1)
else:
break
raise_for_status(resp)
return resp.json()['data']['signed_key']

View File
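Both Vault hunks above wrap the request in the same bounded retry: HTTP 412 from Vault Enterprise can mean the node answering has not yet caught up with a recent write (see the linked consistency doc), so the request is retried up to five times, one second apart, before the status is treated as final. The pattern in isolation:

import time
import requests

def request_with_412_retries(sess, method, url, retries=5, **kwargs):
    for _ in range(retries):
        response = sess.request(method, url, **kwargs)
        # https://developer.hashicorp.com/vault/docs/enterprise/consistency
        if response.status_code != 412:
            break
        time.sleep(1)
    response.raise_for_status()  # a final 412 still raises
    return response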

@@ -14,7 +14,6 @@ logger = logging.getLogger('awx.main.dispatch')
class Control(object):
services = ('dispatcher', 'callback_receiver')
result = None

View File

@@ -192,7 +192,6 @@ class PoolWorker(object):
class StatefulPoolWorker(PoolWorker):
track_managed_tasks = True

View File

@@ -66,7 +66,6 @@ class task:
bind_kwargs = self.bind_kwargs
class PublisherMixin(object):
queue = None
@classmethod

View File

@@ -40,7 +40,6 @@ class WorkerSignalHandler:
class AWXConsumerBase(object):
last_stats = time.time()
def __init__(self, name, worker, queues=[], pool=None):

View File

@@ -3,14 +3,12 @@ import logging
import os
import signal
import time
import traceback
import datetime
from django.conf import settings
from django.utils.functional import cached_property
from django.utils.timezone import now as tz_now
from django.db import DatabaseError, OperationalError, transaction, connection as django_connection
from django.db.utils import InterfaceError, InternalError
from django.db import transaction, connection as django_connection
from django_guid import set_guid
import psutil
@@ -64,6 +62,7 @@ class CallbackBrokerWorker(BaseWorker):
"""
MAX_RETRIES = 2
INDIVIDUAL_EVENT_RETRIES = 3
last_stats = time.time()
last_flush = time.time()
total = 0
@@ -155,6 +154,8 @@ class CallbackBrokerWorker(BaseWorker):
metrics_events_missing_created = 0
metrics_total_job_event_processing_seconds = datetime.timedelta(seconds=0)
for cls, events in self.buff.items():
if not events:
continue
logger.debug(f'{cls.__name__}.objects.bulk_create({len(events)})')
for e in events:
e.modified = now # this can be set before created because now is set above on line 149
@@ -164,38 +165,48 @@ class CallbackBrokerWorker(BaseWorker):
else: # only calculate the seconds if the created time already has been set
metrics_total_job_event_processing_seconds += e.modified - e.created
metrics_duration_to_save = time.perf_counter()
saved_events = []
try:
cls.objects.bulk_create(events)
metrics_bulk_events_saved += len(events)
saved_events = events
self.buff[cls] = []
except Exception as exc:
logger.warning(f'Error in events bulk_create, will try individually up to 5 errors, error {str(exc)}')
# If the database is flaking, let ensure_connection throw a general exception
# that will be caught by the outer loop, which goes into a proper sleep and retry loop
django_connection.ensure_connection()
logger.warning(f'Error in events bulk_create, will try individually, error: {str(exc)}')
# if an exception occurs, we should re-attempt to save the
# events one-by-one, because something in the list is
# broken/stale
consecutive_errors = 0
events_saved = 0
metrics_events_batch_save_errors += 1
for e in events:
for e in events.copy():
try:
e.save()
events_saved += 1
consecutive_errors = 0
metrics_singular_events_saved += 1
events.remove(e)
saved_events.append(e) # Importantly, remove successfully saved events from the buffer
except Exception as exc_indv:
consecutive_errors += 1
logger.info(f'Database Error Saving individual Job Event, error {str(exc_indv)}')
if consecutive_errors >= 5:
raise
metrics_singular_events_saved += events_saved
if events_saved == 0:
raise
retry_count = getattr(e, '_retry_count', 0) + 1
e._retry_count = retry_count
# special sanitization logic for postgres treatment of NUL 0x00 char
if (retry_count == 1) and isinstance(exc_indv, ValueError) and ("\x00" in e.stdout):
e.stdout = e.stdout.replace("\x00", "")
if retry_count >= self.INDIVIDUAL_EVENT_RETRIES:
logger.error(f'Hit max retries ({retry_count}) saving individual Event error: {str(exc_indv)}\ndata:\n{e.__dict__}')
events.remove(e)
else:
logger.info(f'Database Error Saving individual Event uuid={e.uuid} try={retry_count}, error: {str(exc_indv)}')
metrics_duration_to_save = time.perf_counter() - metrics_duration_to_save
for e in events:
for e in saved_events:
if not getattr(e, '_skip_websocket_message', False):
metrics_events_broadcast += 1
emit_event_detail(e)
if getattr(e, '_notification_trigger_event', False):
job_stats_wrapup(getattr(e, e.JOB_REFERENCE), event=e)
self.buff = {}
self.last_flush = time.time()
# only update metrics if we saved events
if (metrics_bulk_events_saved + metrics_singular_events_saved) > 0:
@@ -267,20 +278,16 @@ class CallbackBrokerWorker(BaseWorker):
try:
self.flush(force=flush)
break
except (OperationalError, InterfaceError, InternalError) as exc:
except Exception as exc:
# Aside from bugs, exceptions here are assumed to be due to database flake
if retries >= self.MAX_RETRIES:
logger.exception('Worker could not re-establish database connectivity, giving up on one or more events.')
self.buff = {}
return
delay = 60 * retries
logger.warning(f'Database Error Flushing Job Events, retry #{retries + 1} in {delay} seconds: {str(exc)}')
django_connection.close()
time.sleep(delay)
retries += 1
except DatabaseError:
logger.exception('Database Error Flushing Job Events')
django_connection.close()
break
except Exception as exc:
tb = traceback.format_exc()
logger.error('Callback Task Processor Raised Exception: %r', exc)
logger.error('Detail: {}'.format(tb))
except Exception:
logger.exception(f'Callback Task Processor Raised Unexpected Exception processing event data:\n{body}')

View File
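The callback-receiver hunk above changes flush() so events are only dropped deliberately: one bulk_create attempt, then a per-event fallback in which each failing event carries a retry counter and is discarded only after INDIVIDUAL_EVENT_RETRIES attempts (with a one-time NUL-byte sanitization for postgres along the way). The buffer strategy in isolation, with bulk_save and save_one standing in for cls.objects.bulk_create and event.save:

INDIVIDUAL_EVENT_RETRIES = 3

def flush_events(events, bulk_save, save_one):
    saved = []
    try:
        bulk_save(events)
        saved, events = list(events), []
    except Exception:
        for e in list(events):
            try:
                save_one(e)
                events.remove(e)  # only successfully saved events leave the buffer
                saved.append(e)
            except Exception:
                e.retry_count = getattr(e, 'retry_count', 0) + 1
                if e.retry_count >= INDIVIDUAL_EVENT_RETRIES:
                    events.remove(e)  # give up on this event after max retries
    return saved, events  # whatever remains is retried on the next flush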

@@ -232,7 +232,6 @@ class ImplicitRoleField(models.ForeignKey):
field_names = [field_names]
for field_name in field_names:
if field_name.startswith('singleton:'):
continue
@@ -244,7 +243,6 @@ class ImplicitRoleField(models.ForeignKey):
field = getattr(cls, field_name, None)
if field and type(field) is ReverseManyToOneDescriptor or type(field) is ManyToManyDescriptor:
if '.' in field_attr:
raise Exception('Referencing deep roles through ManyToMany fields is unsupported.')
@@ -629,7 +627,6 @@ class CredentialInputField(JSONSchemaField):
# `ssh_key_unlock` requirements are very specific and can't be
# represented without complicated JSON schema
if model_instance.credential_type.managed is True and 'ssh_key_unlock' in defined_fields:
# in order to properly test the necessity of `ssh_key_unlock`, we
# need to know the real value of `ssh_key_data`; for a payload like:
# {
@@ -791,7 +788,8 @@ class CredentialTypeInjectorField(JSONSchemaField):
'type': 'object',
'patternProperties': {
# http://docs.ansible.com/ansible/playbooks_variables.html#what-makes-a-valid-variable-name
'^[a-zA-Z_]+[a-zA-Z0-9_]*$': {'type': 'string'},
# plus, add ability to template
r'^[a-zA-Z_\{\}]+[a-zA-Z0-9_\{\}]*$': {"anyOf": [{'type': 'string'}, {'type': 'array'}, {'$ref': '#/properties/extra_vars'}]}
},
'additionalProperties': False,
},
@@ -858,27 +856,44 @@ class CredentialTypeInjectorField(JSONSchemaField):
template_name = template_name.split('.')[1]
setattr(valid_namespace['tower'].filename, template_name, 'EXAMPLE_FILENAME')
def validate_template_string(type_, key, tmpl):
try:
sandbox.ImmutableSandboxedEnvironment(undefined=StrictUndefined).from_string(tmpl).render(valid_namespace)
except UndefinedError as e:
raise django_exceptions.ValidationError(
_('{sub_key} uses an undefined field ({error_msg})').format(sub_key=key, error_msg=e),
code='invalid',
params={'value': value},
)
except SecurityError as e:
raise django_exceptions.ValidationError(_('Encountered unsafe code execution: {}').format(e))
except TemplateSyntaxError as e:
raise django_exceptions.ValidationError(
_('Syntax error rendering template for {sub_key} inside of {type} ({error_msg})').format(sub_key=key, type=type_, error_msg=e),
code='invalid',
params={'value': value},
)
def validate_extra_vars(key, node):
if isinstance(node, dict):
for k, v in node.items():
validate_template_string("extra_vars", 'a key' if key is None else key, k)
validate_extra_vars(k if key is None else "{key}.{k}".format(key=key, k=k), v)
elif isinstance(node, list):
for i, x in enumerate(node):
validate_extra_vars("{key}[{i}]".format(key=key, i=i), x)
else:
validate_template_string("extra_vars", key, node)
for type_, injector in value.items():
if type_ == 'env':
for key in injector.keys():
self.validate_env_var_allowed(key)
for key, tmpl in injector.items():
try:
sandbox.ImmutableSandboxedEnvironment(undefined=StrictUndefined).from_string(tmpl).render(valid_namespace)
except UndefinedError as e:
raise django_exceptions.ValidationError(
_('{sub_key} uses an undefined field ({error_msg})').format(sub_key=key, error_msg=e),
code='invalid',
params={'value': value},
)
except SecurityError as e:
raise django_exceptions.ValidationError(_('Encountered unsafe code execution: {}').format(e))
except TemplateSyntaxError as e:
raise django_exceptions.ValidationError(
_('Syntax error rendering template for {sub_key} inside of {type} ({error_msg})').format(sub_key=key, type=type_, error_msg=e),
code='invalid',
params={'value': value},
)
if type_ == 'extra_vars':
validate_extra_vars(None, injector)
else:
for key, tmpl in injector.items():
validate_template_string(type_, key, tmpl)
class AskForField(models.BooleanField):

View File
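The injector-field hunk above factors the sandboxed render into validate_template_string and adds validate_extra_vars, which walks a nested extra_vars structure and validates every key and leaf (keys may themselves be templated). The two helpers in isolation, minus the Django ValidationError wrapping:

from jinja2 import StrictUndefined
from jinja2.sandbox import ImmutableSandboxedEnvironment

def validate_template_string(tmpl, namespace):
    # raises UndefinedError, SecurityError, or TemplateSyntaxError on bad input
    ImmutableSandboxedEnvironment(undefined=StrictUndefined).from_string(tmpl).render(namespace)

def validate_extra_vars(node, namespace):
    if isinstance(node, dict):
        for k, v in node.items():
            validate_template_string(k, namespace)  # keys may be templated too
            validate_extra_vars(v, namespace)
    elif isinstance(node, list):
        for x in node:
            validate_extra_vars(x, namespace)
    else:
        validate_template_string(node, namespace)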

@@ -9,7 +9,6 @@ class Command(BaseCommand):
"""Checks connection to the database, and prints out connection info if not connected"""
def handle(self, *args, **options):
with connection.cursor() as cursor:
cursor.execute("SELECT version()")
version = str(cursor.fetchone()[0])

View File

@@ -82,7 +82,6 @@ class DeleteMeta:
part_drop = {}
for pk, status, created in self.jobs_qs:
part_key = partition_table_name(self.job_class, created)
if status in ['pending', 'waiting', 'running']:
part_drop[part_key] = False

View File

@@ -17,7 +17,6 @@ class Command(BaseCommand):
def handle(self, *args, **options):
if not options['user']:
raise CommandError('Username not supplied. Usage: awx-manage create_oauth2_token --user=username.')
try:
user = User.objects.get(username=options['user'])

View File

@@ -0,0 +1,143 @@
import time
from urllib.parse import urljoin
from argparse import ArgumentTypeError
from django.conf import settings
from django.core.management.base import BaseCommand, CommandError
from django.db.models import Q
from django.utils.timezone import now
from awx.main.models import Instance, UnifiedJob
class AWXInstance:
def __init__(self, **filter):
self.filter = filter
self.get_instance()
def get_instance(self):
filter = self.filter if self.filter is not None else dict(hostname=settings.CLUSTER_HOST_ID)
qs = Instance.objects.filter(**filter)
if not qs.exists():
raise ValueError(f"No AWX instance found with {filter} parameters")
self.instance = qs.first()
def disable(self):
if self.instance.enabled:
self.instance.enabled = False
self.instance.save()
return True
def enable(self):
if not self.instance.enabled:
self.instance.enabled = True
self.instance.save()
return True
def jobs(self):
return UnifiedJob.objects.filter(
Q(controller_node=self.instance.hostname) | Q(execution_node=self.instance.hostname), status__in=("running", "waiting")
)
def jobs_pretty(self):
jobs = []
for j in self.jobs():
job_started = j.started if j.started else now()
# similar calculation of `elapsed` as the corresponding serializer
# does
td = now() - job_started
elapsed = (td.microseconds + (td.seconds + td.days * 24 * 3600) * 10**6) / (10**6 * 1.0)
elapsed = float(elapsed)
details = dict(
name=j.name,
url=j.get_ui_url(),
elapsed=elapsed,
)
jobs.append(details)
jobs = sorted(jobs, reverse=True, key=lambda j: j["elapsed"])
return ", ".join([f"[\"{j['name']}\"]({j['url']})" for j in jobs])
def instance_pretty(self):
instance = (
self.instance.hostname,
urljoin(settings.TOWER_URL_BASE, f"/#/instances/{self.instance.pk}/details"),
)
return f"[\"{instance[0]}\"]({instance[1]})"
class Command(BaseCommand):
help = "Disable instance, optionally waiting for all its managed jobs to finish."
@staticmethod
def ge_1(arg):
if arg == "inf":
return float("inf")
int_arg = int(arg)
if int_arg < 1:
raise ArgumentTypeError(f"The value must be a positive number >= 1. Provided: \"{arg}\"")
return int_arg
def add_arguments(self, parser):
filter_group = parser.add_mutually_exclusive_group()
filter_group.add_argument(
"--hostname",
type=str,
default=settings.CLUSTER_HOST_ID,
help=f"{Instance.hostname.field.help_text} Defaults to the hostname of the machine where the Python interpreter is currently executing".strip(),
)
filter_group.add_argument("--id", type=self.ge_1, help=Instance.id.field.help_text)
parser.add_argument(
"--wait",
action="store_true",
help="Wait for jobs managed by the instance to finish. With default retry arguments waits ~1h",
)
parser.add_argument(
"--retry",
type=self.ge_1,
default=120,
help="Number of retries when waiting for jobs to finish. Default: 120. Also accepts \"inf\" to wait indefinitely",
)
parser.add_argument(
"--retry_sleep",
type=self.ge_1,
default=30,
help="Number of seconds to sleep before consequtive retries when waiting. Default: 30",
)
def handle(self, *args, **options):
try:
filter = dict(id=options["id"]) if options["id"] is not None else dict(hostname=options["hostname"])
instance = AWXInstance(**filter)
except ValueError as e:
raise CommandError(e)
if instance.disable():
self.stdout.write(self.style.SUCCESS(f"Instance {instance.instance_pretty()} has been disabled"))
else:
self.stdout.write(f"Instance {instance.instance_pretty()} has already been disabled")
if not options["wait"]:
return
rc = 1
while instance.jobs().count() > 0:
if rc < options["retry"]:
self.stdout.write(
f"{rc}/{options['retry']}: Waiting {options['retry_sleep']}s before the next attempt to see if the following instance' managed jobs have finished: {instance.jobs_pretty()}"
)
rc += 1
time.sleep(options["retry_sleep"])
else:
raise CommandError(
f"{rc}/{options['retry']}: No more retry attempts left, but the instance still has associated managed jobs: {instance.jobs_pretty()}"
)
else:
self.stdout.write(self.style.SUCCESS("Done waiting for instance' managed jobs to finish!"))

View File
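A hedged usage sketch for the new command, assuming the module is named disable_instance (the filename is not shown in this diff). From a shell this would be awx-manage disable_instance --hostname ... --wait; the equivalent through Django's call_command API:

from django.core.management import call_command

# Disable a specific node and wait (up to retry * retry_sleep seconds)
# for its running/waiting jobs to drain; the hostname is a placeholder.
call_command('disable_instance', hostname='awx-node-1', wait=True, retry=10, retry_sleep=5)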

@@ -0,0 +1,35 @@
from django.core.management.base import BaseCommand, CommandError
from django.conf import settings
class Command(BaseCommand):
"""enable or disable authentication system"""
def add_arguments(self, parser):
"""
This adds the --enable and --disable functionalities to the command, using a mutually exclusive group to avoid situations in which users pass both flags
"""
group = parser.add_mutually_exclusive_group()
group.add_argument('--enable', dest='enable', action='store_true', help='Pass --enable to enable local authentication')
group.add_argument('--disable', dest='disable', action='store_true', help='Pass --disable to disable local authentication')
def _enable_disable_auth(self, enable, disable):
"""
This method enables or disables local authentication based on the argument passed into the parser.
If no argument is given, raise a CommandError; if --enable is passed, set DISABLE_LOCAL_AUTH to False,
and if --disable is passed, set it to True. Note that the setting's polarity is the inverse of the flags.
"""
if enable:
settings.DISABLE_LOCAL_AUTH = False
print("Setting has changed to {} allowing local authentication".format(settings.DISABLE_LOCAL_AUTH))
elif disable:
settings.DISABLE_LOCAL_AUTH = True
print("Setting has changed to {} disallowing local authentication".format(settings.DISABLE_LOCAL_AUTH))
else:
raise CommandError('Please pass --enable flag to allow local auth or --disable flag to disable local auth')
def handle(self, **options):
self._enable_disable_auth(options.get('enable'), options.get('disable'))

View File
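As the docstring notes, the flag-to-setting mapping above is inverted: --enable clears DISABLE_LOCAL_AUTH and --disable sets it. The decision logic in isolation (the command's module name is not shown in this diff, so this sketch calls the logic directly rather than through awx-manage):

def resolve_disable_local_auth(enable, disable):
    if enable:
        return False  # DISABLE_LOCAL_AUTH = False -> local auth allowed
    if disable:
        return True   # DISABLE_LOCAL_AUTH = True  -> local auth blocked
    raise ValueError('Pass --enable or --disable')

assert resolve_disable_local_auth(enable=False, disable=True) is True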

@@ -10,7 +10,6 @@ from django.utils.text import slugify
class Command(BaseCommand):
help = 'Export custom inventory scripts into a tarfile.'
def add_arguments(self, parser):
@@ -21,7 +20,6 @@ class Command(BaseCommand):
with tempfile.TemporaryDirectory() as tmpdirname:
with tarfile.open(tar_filename, "w") as tar:
for cis in CustomInventoryScript.objects.all():
# naming convention similar to project paths
slug_name = slugify(str(cis.name)).replace(u'-', u'_')

View File

@@ -6,7 +6,6 @@ import json
class Command(BaseCommand):
help = 'This is for offline licensing usage'
def add_arguments(self, parser):

View File

@@ -934,7 +934,6 @@ class Command(BaseCommand):
# (even though inventory_import.Command.handle -- which calls
# perform_update -- has its own lock, inventory_ID_import)
with advisory_lock('inventory_{}_perform_update'.format(self.inventory.id)):
try:
self.check_license()
except PermissionDenied as e:

View File

@@ -6,7 +6,6 @@ from django.core.management.base import BaseCommand
class Ungrouped(object):
name = 'ungrouped'
policy_instance_percentage = None
policy_instance_minimum = None

View File

@@ -32,8 +32,14 @@ class Command(BaseCommand):
def handle(self, **options):
self.old_key = settings.SECRET_KEY
custom_key = os.environ.get("TOWER_SECRET_KEY")
if options.get("use_custom_key") and custom_key:
self.new_key = custom_key
if options.get("use_custom_key"):
if custom_key:
self.new_key = custom_key
else:
print("Use custom key was specified but the env var TOWER_SECRET_KEY was not available")
import sys
sys.exit(1)
else:
self.new_key = base64.encodebytes(os.urandom(33)).decode().rstrip()
self._notification_templates()

View File
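The regenerate-secret-key hunk above turns a silent fallthrough into a hard failure: with --use-custom-key set but TOWER_SECRET_KEY absent from the environment, the command now exits non-zero instead of quietly generating a random key. The guard in isolation, as a standalone sketch:

import base64
import os
import sys

def pick_new_key(use_custom_key):
    custom_key = os.environ.get('TOWER_SECRET_KEY')
    if use_custom_key:
        if not custom_key:
            print('Use custom key was specified but the env var TOWER_SECRET_KEY was not available')
            sys.exit(1)
        return custom_key
    return base64.encodebytes(os.urandom(33)).decode().rstrip()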

@@ -7,7 +7,6 @@ from django.core.management.base import BaseCommand, CommandError
class Command(BaseCommand):
help = (
"Remove an instance (specified by --hostname) from the specified queue (instance group).\n"
"In order remove the queue, use the `unregister_queue` command."

View File

@@ -28,7 +28,6 @@ class JobStatusLifeCycle:
class ReplayJobEvents(JobStatusLifeCycle):
recording_start = None
replay_start = None
@@ -190,7 +189,6 @@ class ReplayJobEvents(JobStatusLifeCycle):
class Command(BaseCommand):
help = 'Replay job events over websockets ordered by created on date.'
def _parse_slice_range(self, slice_arg):

View File

@@ -7,7 +7,6 @@ from awx.main.models import CredentialType
class Command(BaseCommand):
help = 'Load default managed credential types.'
def handle(self, *args, **options):

View File

@@ -10,7 +10,6 @@ from django.core.management.base import BaseCommand, CommandError
class Command(BaseCommand):
help = (
"Remove specified queue (instance group) from database.\n"
"Instances inside of queue will continue to exist, \n"

View File

@@ -158,7 +158,11 @@ class InstanceManager(models.Manager):
return (False, instance)
# Create new instance, and fill in default values
create_defaults = {'node_state': Instance.States.INSTALLED, 'capacity': 0}
create_defaults = {
'node_state': Instance.States.INSTALLED,
'capacity': 0,
'listener_port': 27199,
}
if defaults is not None:
create_defaults.update(defaults)
uuid_option = {}

View File
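The instance-manager hunk above registers new instances with a listener_port of 27199 (which matches the receptor listener default) while still letting caller-supplied defaults win, since update() is applied after the built-ins. The merge order in isolation, with a hypothetical override value:

create_defaults = {'node_state': 'installed', 'capacity': 0, 'listener_port': 27199}
defaults = {'listener_port': 27344}  # hypothetical caller override
if defaults is not None:
    create_defaults.update(defaults)
assert create_defaults['listener_port'] == 27344  # caller value wins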

@@ -38,7 +38,6 @@ class SettingsCacheMiddleware(MiddlewareMixin):
class TimingMiddleware(threading.local, MiddlewareMixin):
dest = '/var/log/tower/profile'
def __init__(self, *args, **kwargs):

View File

@@ -14,7 +14,6 @@ import awx.main.fields
class Migration(migrations.Migration):
dependencies = [
('taggit', '0002_auto_20150616_2121'),
('contenttypes', '0002_remove_content_type_name'),

View File

@@ -12,7 +12,6 @@ from ._squashed_30 import SQUASHED_30
class Migration(migrations.Migration):
dependencies = [
('main', '0003_squashed_v300_v303_updates'),
]

View File

@@ -7,7 +7,6 @@ from ._squashed_31 import SQUASHED_31
class Migration(migrations.Migration):
dependencies = [
('main', '0004_squashed_v310_release'),
]

View File

@@ -26,7 +26,6 @@ def replaces():
class Migration(migrations.Migration):
dependencies = [
('main', '0005_squashed_v310_v313_updates'),
]

View File

@@ -7,7 +7,6 @@ from awx.main.migrations import ActivityStreamDisabledMigration
class Migration(ActivityStreamDisabledMigration):
dependencies = [
('main', '0006_v320_release'),
]

View File

@@ -11,7 +11,6 @@ import awx.main.fields
class Migration(migrations.Migration):
dependencies = [
('main', '0007_v320_data_migrations'),
]

View File

@@ -7,7 +7,6 @@ import awx.main.fields
class Migration(migrations.Migration):
dependencies = [
('main', '0008_v320_drop_v1_credential_fields'),
]

View File

@@ -5,7 +5,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('main', '0009_v322_add_setting_field_for_activity_stream'),
]

View File

@@ -5,7 +5,6 @@ from awx.main.migrations import ActivityStreamDisabledMigration
class Migration(ActivityStreamDisabledMigration):
dependencies = [
('main', '0010_v322_add_ovirt4_tower_inventory'),
]

View File

@@ -5,7 +5,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('main', '0011_v322_encrypt_survey_passwords'),
]

View File

@@ -9,7 +9,6 @@ from awx.main.migrations._multi_cred import migrate_to_multi_cred, migrate_back_
class Migration(migrations.Migration):
dependencies = [
('main', '0012_v322_update_cred_types'),
]

View File

@@ -11,7 +11,6 @@ from awx.main.migrations._scan_jobs import remove_scan_type_nodes
class Migration(migrations.Migration):
dependencies = [
('main', '0013_v330_multi_credential'),
]

View File

@@ -7,7 +7,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('main', '0014_v330_saved_launchtime_configs'),
]

View File

@@ -7,7 +7,6 @@ import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('main', '0015_v330_blank_start_args'),
]

View File

@@ -6,7 +6,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('main', '0016_v330_non_blank_workflow'),
]

View File

@@ -9,7 +9,6 @@ import awx.main.fields
class Migration(migrations.Migration):
dependencies = [
('main', '0017_v330_move_deprecated_stdout'),
]

View File

@@ -6,7 +6,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('main', '0018_v330_add_additional_stdout_events'),
]

View File

@@ -8,7 +8,6 @@ import awx.main.fields
class Migration(migrations.Migration):
dependencies = [
('main', '0019_v330_custom_virtualenv'),
]

View File

@@ -8,7 +8,6 @@ import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('main', '0020_v330_instancegroup_policies'),
]

View File

@@ -8,7 +8,6 @@ from awx.main.migrations import _migration_utils as migration_utils
class Migration(ActivityStreamDisabledMigration):
dependencies = [
('main', '0021_v330_declare_new_rbac_roles'),
]

View File

@@ -9,7 +9,6 @@ from awx.main.migrations._multi_cred import migrate_inventory_source_cred, migra
class Migration(migrations.Migration):
dependencies = [
('main', '0022_v330_create_new_rbac_roles'),
]

View File

@@ -8,7 +8,6 @@ import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('sessions', '0001_initial'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),

Some files were not shown because too many files have changed in this diff.