Compare commits

...

72 Commits

Author SHA1 Message Date
John Westcott IV
af59abbbc4 Fixing NUL characters in event data 2023-05-02 14:37:35 -04:00
John Westcott IV
8ab3514428 Fixing ValueError becoming DataError 2023-05-02 11:47:12 -04:00
John Westcott IV
98781a82c7 Merge branch 'feature-django-upgrade' of github.com:ansible/awx into feature-django-upgrade 2023-05-02 11:45:51 -04:00
John Westcott IV
d3fabe81d1 Fixing deprecation: QuerySet.iterator() after prefetch_related() without specifying chunk_size is deprecated 2023-04-28 15:32:20 -04:00
John Westcott IV
b274d0e5ef Removing deprecated django.utils.timezone.utc alias in favor of datetime.timezone.utc 2023-04-28 15:32:20 -04:00
John Westcott IV
4494412f0c Replacing deprecated index_together with new indexes 2023-04-28 15:31:28 -04:00
John Westcott IV
b82bec7d04 Replacing psycopg2.copy_expert with psycopg3.copy 2023-04-28 12:35:49 -04:00
John Westcott IV
2cee1caad2 Fixing final CI error 2023-04-28 12:35:49 -04:00
John Westcott IV
c3045b1169 Updating old migrations for psycopg3
We have both psycopg2 and psycopg3 installed in the AWX venv.

Old versions of Django only used psycopg2, but 4.2 now supports psycopg3.

Django 4.2 detects psycopg3 first and will use it over psycopg2.

So old migrations needed to be updated to support psycopg3.
2023-04-28 12:35:49 -04:00
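As a hedged illustration of the detection order described above (a minimal sketch, not Django's actual backend code), the preference can be reproduced with a try-import:

```python
# Minimal sketch: prefer psycopg (v3) and fall back to psycopg2, mirroring the
# preference described in the commit message. Django 4.2's real detection lives
# in its postgresql backend; this is only an approximation.
try:
    import psycopg as pg_driver  # psycopg3 wins when both are installed

    PSYCOPG_VERSION = 3
except ImportError:
    import psycopg2 as pg_driver

    PSYCOPG_VERSION = 2

print(pg_driver.__name__, PSYCOPG_VERSION)
```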
John Westcott IV
27024378bc Upgrading Django to 4.2 LTS 2023-04-28 12:35:49 -04:00
John Westcott IV
8eff90d4c0 Adding upgrade to django-oauth-toolkit pre-migration 2023-04-28 12:35:49 -04:00
Alan Rominger
77175d2862 Consolidate get_queryset methods (#13906)
In a prior merge, we added the ability to set filter_read_permission = False on a view so that the sublist the view shows is not filtered by read permission.

This logic already existed in a highly duplicated form among a number of views, so this deletes those methods in favor of the flag.
2023-04-28 09:10:18 -04:00
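A hedged sketch of how the flag is used on a sublist view (the class shape follows JobTemplateCredentialsList in the diffs below; treat it as illustrative, not the exact PR code):

```python
from awx.api import serializers
from awx.api.generics import SubListAPIView
from awx.main import models

# Illustrative sublist view: with filter_read_permission = False, get_queryset()
# returns the parent's related objects (plus standard queryset optimizations)
# instead of intersecting them with the user's read-permission queryset.
class ExampleCredentialsList(SubListAPIView):
    model = models.Credential
    serializer_class = serializers.CredentialSerializer
    parent_model = models.JobTemplate
    relationship = 'credentials'
    filter_read_permission = False
```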
John Westcott IV
9b633b6492 Fixing final CI error 2023-04-27 08:00:56 -04:00
Klaas Demter
22464a5838 Enhance secret retrieval documentation (#13914) 2023-04-26 19:32:40 +00:00
Sarah Akus
3919ea6270 Merge pull request #13905 from vidyanambiar/topology-rbac
Make Topology view and Instances visible only to system admin/auditor
2023-04-26 15:13:32 -04:00
jessicamack
66a3cb6b09 Merge pull request #13858 from jessicamack/13322-catch-sigterm
Catch SIGTERM or SIGINT and send offline message
2023-04-26 12:24:34 -04:00
jessicamack
d282393035 change exit code
Signed-off-by: jessicamack <jmack@redhat.com>
2023-04-26 11:37:59 -04:00
jessicamack
6ea3b20912 revert previous commit to break into separate PR
Signed-off-by: jessicamack <jmack@redhat.com>
2023-04-26 11:37:59 -04:00
jessicamack
3025ef0dfa move with block inside of while to free up persistent db connection
Signed-off-by: jessicamack <jmack@redhat.com>
2023-04-26 11:37:59 -04:00
jessicamack
397d58c459 removed TODO. moved signal catches to handle()
Signed-off-by: jessicamack <jmack@redhat.com>
2023-04-26 11:37:59 -04:00
jessicamack
d739a4a90a updated black and ran again to fix lint formatting
Signed-off-by: jessicamack <jmack@redhat.com>
2023-04-26 11:37:59 -04:00
jessicamack
3fe64ad101 fix signal handler. black reformats
Signed-off-by: jessicamack <jmack@redhat.com>
2023-04-26 11:37:59 -04:00
jessicamack
919d1e5d40 catch SIGTERM or SIGINT and send offline message
Signed-off-by: jessicamack <jmack@redhat.com>
2023-04-26 11:37:59 -04:00
John Westcott IV
11dbc56ecb Updating old migrations for psycopg3
We have both psycopg2 and psycopg3 installed in the AWX venv.

Old versions of Django only used psycopg2, but 4.2 now supports psycopg3.

Django 4.2 detects psycopg3 first and will use it over psycopg2.

So old migrations needed to be updated to support psycopg3.
2023-04-26 09:10:25 -04:00
John Westcott IV
4c1bd1e88e Upgrading Django to 4.2 LTS 2023-04-26 09:10:25 -04:00
John Westcott IV
865cb7518e Adding upgrade to django-oauth-toolkit pre-migration 2023-04-26 09:10:25 -04:00
John Westcott IV
7fda4b0675 Merge pull request #13903 from john-westcott-iv/collection_intergration_tests
Enhance collection integration tests
2023-04-26 09:08:00 -04:00
Gabriel Muniz
d8af19d169 Fix organization not showing all galaxy credentials for org admin (#13676)
* Fix organization not showing all galaxy credentials for org admin

* Add basic test to ensure counts

* refactored approach to allow removal of redundant code

* Allow configurable prefetch_related

* implicitly get related fields

* Removed extra queryset code
2023-04-25 15:33:42 -04:00
Vidya Nambiar
1821e540f7 Merge branch 'devel' into topology-rbac 2023-04-25 15:32:17 -04:00
Vidya Nambiar
77be6c7495 tests 2023-04-25 14:18:05 -04:00
John Westcott IV
baed869d93 Remove project_manual integration test
This test can no longer be performed without manual intervention because of how jobs are now run in EEs
2023-04-25 13:49:50 -04:00
John Westcott IV
b87ff45c07 Enhance collection test
ad_hoc_command_cancel can no longer realistically time out on a cancel (it happens sub-second), so remove the unneeded block

Modified all tests to respect the test_id parameter so that all tests can be run together under a single ID

Fix a check in group since group2 is removed as a subgroup of group1

The UI now allows propagating subgroups to the inventory, which we may want to support within the collection

Only run the instance integration test if we are running on k8s, and assume we are not by default

Fix hard-coded names in manual_project
2023-04-25 13:48:37 -04:00
Alan Rominger
7acc0067f5 Remove Ansible config override to validate group names (#13837) 2023-04-25 13:37:13 -04:00
Alan Rominger
0a13762f11 Use separate module for pytest settings (#13895)
* Use separate module for test settings

* Further refine some pre-existing comments in settings

* Add CACHES to setting snapshot exceptions to accommodate changed load order
2023-04-25 13:31:46 -04:00
Vidya Nambiar
2c673c8f1f Make Topology view and Instances visible only to system admin/auditor 2023-04-25 12:44:27 -04:00
John Westcott IV
8c187c74fc Adding "password": "$encrypted$" to user serializer (#13704)
Co-authored-by: Jessica Steurer <70719005+jay-steurer@users.noreply.github.com>
2023-04-25 10:18:01 -03:00
Jesse Wattenbarger
2ce9440bab Merge pull request #13896 from jjwatt/jjwatt-pyver
Fall back on PYTHON path in Makefile
2023-04-24 10:10:30 -04:00
Jesse Wattenbarger
765487390f Fall back on PYTHON path in Makefile
- Change default PYTHON in Makefile to be ranked choice
- Fix `PYTHON_VERSION` target that expects just a word
- Use native GNU Make `$(subst ,,)` instead of `sed`
- Add 'version-for-buildyml' target to simplify CI

If I understand correctly, this change should make
'$(PYTHON)' work how we want it to everywhere. Before
this change, on developers' machines that don't have
a 'python3.9' in their path, make would fail. With this
change, we will prefer python3.9 if it's available, but
we'll take python3 otherwise.
2023-04-21 09:50:05 -04:00
Alan Rominger
086722149c Avoid recursive include of DEFAULT_SETTINGS, add sanity test (#13236)
* Avoid recursive include of DEFAULT_SETTINGS, add sanity test to avoid similar surprises

* Implement review comments for more clear code order and readability

* Clarify comment about order of app name, which is last in order so that it can modify user settings
2023-04-20 15:15:34 -04:00
Sarah Akus
c10ada6f44 Merge pull request #13876 from marshmalien/9668-adhoc-credentials-search
Fix credentials search in adhoc prompt modal
2023-04-20 13:41:36 -04:00
Sarah Akus
b350cd053d Merge pull request #13886 from marshmalien/fix-wf-approval-job-details
Fix incorrect workflow approval job details
2023-04-20 13:31:32 -04:00
Alan Rominger
d0acb1c53f Delete cp of local_settings.py file in the test runner, because the path no longer exists (#13894)
* Change reference to moved local_settings.py file

* Do not apply local_settings to test runner
2023-04-20 13:19:00 -04:00
Hao Liu
f61b73010a Merge pull request #13889 from TheRealHaoLiu/egg-liminate
Remove unnecessary egg-link linking
2023-04-19 17:12:28 -04:00
Hao Liu
adb89cd48f Remove unnecessary egg-link linking
We link awx.egg-link from `tools/docker-compose/awx.egg-link` to `/tmp/awx.egg-link`, then we move `/tmp/awx.egg-link` to `/var/lib/awx/venv/awx/lib/python3.9/site-packages/awx.egg-link`

Bonus: now we don't have to set PYTHON=python3.9
2023-04-19 16:36:51 -04:00
Hao Liu
3e509b3d55 Merge pull request #13883 from ZitaNemeckova/remove_inventories_from_host_metrics
Remove Inventories column for now
2023-04-19 15:41:32 -04:00
Hao Liu
f0badea9d3 Merge pull request #13888 from TheRealHaoLiu/correct-make-call-make
Make target should not call make directly
2023-04-19 15:38:58 -04:00
Hao Liu
6a1ec0dc89 Merge pull request #13887 from TheRealHaoLiu/no-make-run-stuff-in-docker-compose
Stop using make to start awx processes part 1
2023-04-19 15:35:32 -04:00
Hao Liu
329fb88bbb Make target should not call make directly
https://www.gnu.org/software/make/manual/html_node/MAKE-Variable.html

A make target should always invoke make via $(MAKE)
2023-04-19 15:01:16 -04:00
Hao Liu
177f8cb7b2 Stop using make to start processes
part 1...

We don't need to run AWX processes through make
because awx-manage uses awx-python, which already activates the correct venv
2023-04-19 14:51:38 -04:00
Marliana Lara
b43107a5e9 Fix credentials search in adhoc prompt modal 2023-04-19 13:59:08 -04:00
Marliana Lara
e7c80fe1e8 Fix incorrect workflow approval job details 2023-04-19 13:57:05 -04:00
Hao Liu
33f1c35292 Merge pull request #13658 from TheRealHaoLiu/different-dockerfile
Use different dockerfile for docker-compose-build
2023-04-19 12:12:54 -04:00
Hao Liu
ba899324f2 Merge pull request #13856 from TheRealHaoLiu/kube-dev-autoreload
Auto reload services in kube dev env
2023-04-19 12:08:52 -04:00
Hao Liu
9c236eb8dd Merge pull request #13882 from TheRealHaoLiu/link-launch-n-supervisord
Link launch script and supervisor conf in kube dev
2023-04-19 12:03:22 -04:00
Zita Nemeckova
36559a4539 Remove Inventories column for now. Revert this commit once the backend is ready. 2023-04-19 15:55:02 +02:00
Hao Liu
7a4b3ed139 Merge pull request #13881 from TheRealHaoLiu/fix-copy
Fix copy API
2023-04-19 09:39:39 -04:00
Gabriel Muniz
cd5cc64d6a Fix 500 on missing inventory for provisioning callbacks (#13862)
* Fix 500 on missing inventory for provisioning callbacks

* Added test to cover bug fix

* Reworded msg to clarify what is missing to start the callback
2023-04-19 09:27:41 -04:00
Hao Liu
71a11ea3ad Link launch script and supervisor conf in kube dev
Linking the launch script and supervisor conf file in the kube development environment so we no longer have to rebuild kube devel images for supervisor conf file and launch script changes
2023-04-18 23:22:53 -04:00
Hao Liu
cfbbc4cb92 Auto reload services in kube dev env 2023-04-18 23:15:47 -04:00
Hao Liu
592920ee51 Use different dockerfile for docker-compose-build
- use different dockerfile for awx_devel and awx image
- make all Dockerfile* targets PHONY (because it's cheap to run)
- fix HEADLESS not working for awx-kube-build
2023-04-18 21:45:31 -04:00
Hao Liu
b75b84e282 Merge pull request #13725 from l3acon/collection-existential-state-for-credential-module
[collection] Add "exists" state for credential module
2023-04-18 20:51:14 -04:00
Sarah Akus
f4b80c70e3 Merge pull request #13849 from marshmalien/10854-instances-403-error
Check user permissions before fetching system settings
2023-04-18 16:41:40 -04:00
Hao Liu
9870187af5 Fix copy API
In a web/task split deployment, the web and task containers no longer share the same redis cache

In the original code we used the redis cache to pass the list of sub-objects that need to be copied to the new object

In this PR we extracted the logic that computes the sub_object_list and moved it into the deep_copy_model_obj task
2023-04-18 16:03:04 -04:00
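A hedged sketch of the shape of this change (helper and cache names are stand-ins; only the pattern follows the description above):

```python
import uuid

def compute_sub_object_list(obj_pk):
    # Hypothetical stand-in for the real sub-object discovery logic.
    return [f"sub-of-{obj_pk}"]

class FakeCache(dict):
    # Hypothetical stand-in for the redis-backed Django cache.
    def set(self, key, value, timeout=None):
        self[key] = value

# Before (sketch): the web process computed the sub-objects, stashed them in the
# shared redis cache, and handed only the cache key to the task. With web and
# task in separate containers, the task no longer sees that cache.
def copy_before(obj_pk, new_obj_pk, cache):
    key = f"deep-copy-{uuid.uuid4()}"
    cache.set(key, compute_sub_object_list(obj_pk), timeout=3600)
    return (obj_pk, new_obj_pk, key)  # task args; breaks without a shared cache

# After (sketch): only primary keys cross the process boundary; the task
# recomputes the sub-object list itself, so no shared cache is required.
def deep_copy_model_obj(obj_pk, new_obj_pk):
    return compute_sub_object_list(obj_pk)

print(copy_before(1, 2, FakeCache()))
print(deep_copy_model_obj(1, 2))
```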
Matthew Fernandez
d57f549a4c Merge branch 'devel' into collection-existential-state-for-credential-module 2023-04-18 09:51:54 -06:00
matt
93e6f974f6 remove redundant loop 2023-04-18 09:51:20 -06:00
Michael Abashian
68b32b9b4f Merge branch 'devel' into 10854-instances-403-error 2023-04-17 10:14:44 -04:00
Marliana Lara
105609ec20 Check user permissions before fetching system settings 2023-04-12 11:19:37 -04:00
matt
4a3d437b32 spaces for pep8 2023-04-11 11:35:36 -06:00
Matthew Fernandez
184719e9f2 Merge branch 'devel' into collection-existential-state-for-credential-module 2023-04-10 15:31:11 -06:00
matt
b0c416334f add test coverage 2023-03-23 15:44:00 -06:00
matt
7c4aedf716 exit from module 2023-03-20 13:36:24 -06:00
matt
76f03b9adc add exists to awx.awx.credential 2023-03-20 09:59:24 -06:00
105 changed files with 2081 additions and 691 deletions

.gitignore (vendored): 3 changes
View File

@@ -157,9 +157,10 @@ use_dev_supervisor.txt
*.unison.tmp
*.#
/awx/ui/.ui-built
/Dockerfile
/_build/
/_build_kube_dev/
/Dockerfile
/Dockerfile.dev
/Dockerfile.kube-dev
awx/ui_next/src

View File

@@ -1,6 +1,6 @@
-include awx/ui_next/Makefile
PYTHON ?= python3.9
PYTHON := $(notdir $(shell for i in python3.9 python3; do command -v $$i; done|sed 1q))
DOCKER_COMPOSE ?= docker-compose
OFFICIAL ?= no
NODE ?= node
@@ -296,13 +296,13 @@ swagger: reports
check: black
api-lint:
BLACK_ARGS="--check" make black
BLACK_ARGS="--check" $(MAKE) black
flake8 awx
yamllint -s .
## Run egg_info_dev to generate awx.egg-info for development.
awx-link:
[ -d "/awx_devel/awx.egg-info" ] || $(PYTHON) /awx_devel/tools/scripts/egg_info_dev
cp -f /tmp/awx.egg-link /var/lib/awx/venv/awx/lib/$(PYTHON)/site-packages/awx.egg-link
TEST_DIRS ?= awx/main/tests/unit awx/main/tests/functional awx/conf/tests awx/sso/tests
PYTEST_ARGS ?= -n auto
@@ -321,7 +321,7 @@ github_ci_setup:
# CI_GITHUB_TOKEN is defined in .github files
echo $(CI_GITHUB_TOKEN) | docker login ghcr.io -u $(GITHUB_ACTOR) --password-stdin
docker pull $(DEVEL_IMAGE_NAME) || : # Pre-pull image to warm build cache
make docker-compose-build
$(MAKE) docker-compose-build
## Runs AWX_DOCKER_CMD inside a new docker container.
docker-runner:
@@ -371,7 +371,7 @@ test_collection_sanity:
rm -rf $(COLLECTION_INSTALL)
if ! [ -x "$(shell command -v ansible-test)" ]; then pip install ansible-core; fi
ansible --version
COLLECTION_VERSION=1.0.0 make install_collection
COLLECTION_VERSION=1.0.0 $(MAKE) install_collection
cd $(COLLECTION_INSTALL) && ansible-test sanity $(COLLECTION_SANITY_ARGS)
test_collection_integration: install_collection
@@ -556,12 +556,21 @@ docker-compose-container-group-clean:
fi
rm -rf tools/docker-compose-minikube/_sources/
## Base development image build
docker-compose-build:
ansible-playbook tools/ansible/dockerfile.yml -e build_dev=True -e receptor_image=$(RECEPTOR_IMAGE)
DOCKER_BUILDKIT=1 docker build -t $(DEVEL_IMAGE_NAME) \
--build-arg BUILDKIT_INLINE_CACHE=1 \
--cache-from=$(DEV_DOCKER_TAG_BASE)/awx_devel:$(COMPOSE_TAG) .
.PHONY: Dockerfile.dev
## Generate Dockerfile.dev for awx_devel image
Dockerfile.dev: tools/ansible/roles/dockerfile/templates/Dockerfile.j2
ansible-playbook tools/ansible/dockerfile.yml \
-e dockerfile_name=Dockerfile.dev \
-e build_dev=True \
-e receptor_image=$(RECEPTOR_IMAGE)
## Build awx_devel image for docker compose development environment
docker-compose-build: Dockerfile.dev
DOCKER_BUILDKIT=1 docker build \
-f Dockerfile.dev \
-t $(DEVEL_IMAGE_NAME) \
--build-arg BUILDKIT_INLINE_CACHE=1 \
--cache-from=$(DEV_DOCKER_TAG_BASE)/awx_devel:$(COMPOSE_TAG) .
docker-clean:
-$(foreach container_id,$(shell docker ps -f name=tools_awx -aq && docker ps -f name=tools_receptor -aq),docker stop $(container_id); docker rm -f $(container_id);)
@@ -580,7 +589,7 @@ docker-compose-cluster-elk: awx/projects docker-compose-sources
$(DOCKER_COMPOSE) -f tools/docker-compose/_sources/docker-compose.yml -f tools/elastic/docker-compose.logstash-link-cluster.yml -f tools/elastic/docker-compose.elastic-override.yml up --no-recreate
docker-compose-container-group:
MINIKUBE_CONTAINER_GROUP=true make docker-compose
MINIKUBE_CONTAINER_GROUP=true $(MAKE) docker-compose
clean-elk:
docker stop tools_kibana_1
@@ -597,12 +606,36 @@ VERSION:
@echo "awx: $(VERSION)"
PYTHON_VERSION:
@echo "$(PYTHON)" | sed 's:python::'
@echo "$(subst python,,$(PYTHON))"
.PHONY: version-for-buildyml
version-for-buildyml:
@echo $(firstword $(subst +, ,$(VERSION)))
# version-for-buildyml prints a special version string for build.yml,
# chopping off the sha after the '+' sign.
# tools/ansible/build.yml was doing this: make print-VERSION | cut -d + -f -1
# This does the same thing in native make without
# the pipe or the extra processes, and now the playbook does `make version-for-buildyml`
# Example:
# 22.1.1.dev38+g523c0d9781 becomes 22.1.1.dev38
.PHONY: Dockerfile
## Generate Dockerfile for awx image
Dockerfile: tools/ansible/roles/dockerfile/templates/Dockerfile.j2
ansible-playbook tools/ansible/dockerfile.yml -e receptor_image=$(RECEPTOR_IMAGE)
ansible-playbook tools/ansible/dockerfile.yml \
-e receptor_image=$(RECEPTOR_IMAGE) \
-e headless=$(HEADLESS)
## Build awx image for deployment on Kubernetes environment.
awx-kube-build: Dockerfile
DOCKER_BUILDKIT=1 docker build -f Dockerfile \
--build-arg VERSION=$(VERSION) \
--build-arg SETUPTOOLS_SCM_PRETEND_VERSION=$(VERSION) \
--build-arg HEADLESS=$(HEADLESS) \
-t $(DEV_DOCKER_TAG_BASE)/awx:$(COMPOSE_TAG) .
.PHONY: Dockerfile.kube-dev
## Generate Dockerfile.kube-dev for awx_kube_devel image
Dockerfile.kube-dev: tools/ansible/roles/dockerfile/templates/Dockerfile.j2
ansible-playbook tools/ansible/dockerfile.yml \
-e dockerfile_name=Dockerfile.kube-dev \
@@ -617,13 +650,6 @@ awx-kube-dev-build: Dockerfile.kube-dev
--cache-from=$(DEV_DOCKER_TAG_BASE)/awx_kube_devel:$(COMPOSE_TAG) \
-t $(DEV_DOCKER_TAG_BASE)/awx_kube_devel:$(COMPOSE_TAG) .
## Build awx image for deployment on Kubernetes environment.
awx-kube-build: Dockerfile
DOCKER_BUILDKIT=1 docker build -f Dockerfile \
--build-arg VERSION=$(VERSION) \
--build-arg SETUPTOOLS_SCM_PRETEND_VERSION=$(VERSION) \
--build-arg HEADLESS=$(HEADLESS) \
-t $(DEV_DOCKER_TAG_BASE)/awx:$(COMPOSE_TAG) .
# Translation TASKS
# --------------------------------------
@@ -643,6 +669,7 @@ messages:
fi; \
$(PYTHON) manage.py makemessages -l en_us --keep-pot
.PHONY: print-%
print-%:
@echo $($*)
@@ -654,12 +681,12 @@ HELP_FILTER=.PHONY
## Display help targets
help:
@printf "Available targets:\n"
@make -s help/generate | grep -vE "\w($(HELP_FILTER))"
@$(MAKE) -s help/generate | grep -vE "\w($(HELP_FILTER))"
## Display help for all targets
help/all:
@printf "Available targets:\n"
@make -s help/generate
@$(MAKE) -s help/generate
## Generate help output from MAKEFILE_LIST
help/generate:
@@ -683,4 +710,4 @@ help/generate:
## Display help for ui-next targets
help/ui-next:
@make -s help MAKEFILE_LIST="awx/ui_next/Makefile"
@$(MAKE) -s help MAKEFILE_LIST="awx/ui_next/Makefile"

View File

@@ -5,13 +5,11 @@
import inspect
import logging
import time
import uuid
# Django
from django.conf import settings
from django.contrib.auth import views as auth_views
from django.contrib.contenttypes.models import ContentType
from django.core.cache import cache
from django.core.exceptions import FieldDoesNotExist
from django.db import connection, transaction
from django.db.models.fields.related import OneToOneRel
@@ -35,7 +33,7 @@ from rest_framework.negotiation import DefaultContentNegotiation
# AWX
from awx.api.filters import FieldLookupBackend
from awx.main.models import UnifiedJob, UnifiedJobTemplate, User, Role, Credential, WorkflowJobTemplateNode, WorkflowApprovalTemplate
from awx.main.access import access_registry
from awx.main.access import optimize_queryset
from awx.main.utils import camelcase_to_underscore, get_search_fields, getattrd, get_object_or_400, decrypt_field, get_awx_version
from awx.main.utils.db import get_all_field_names
from awx.main.utils.licensing import server_product_name
@@ -364,12 +362,7 @@ class GenericAPIView(generics.GenericAPIView, APIView):
return self.queryset._clone()
elif self.model is not None:
qs = self.model._default_manager
if self.model in access_registry:
access_class = access_registry[self.model]
if access_class.select_related:
qs = qs.select_related(*access_class.select_related)
if access_class.prefetch_related:
qs = qs.prefetch_related(*access_class.prefetch_related)
qs = optimize_queryset(qs)
return qs
else:
return super(GenericAPIView, self).get_queryset()
@@ -512,6 +505,9 @@ class SubListAPIView(ParentMixin, ListAPIView):
# And optionally (user must have given access permission on parent object
# to view sublist):
# parent_access = 'read'
# filter_read_permission sets whether or not to override the default intersection behavior
# implemented here
filter_read_permission = True
def get_description_context(self):
d = super(SubListAPIView, self).get_description_context()
@@ -526,8 +522,10 @@ class SubListAPIView(ParentMixin, ListAPIView):
def get_queryset(self):
parent = self.get_parent_object()
self.check_parent_access(parent)
qs = self.request.user.get_queryset(self.model).distinct()
sublist_qs = self.get_sublist_queryset(parent)
if not self.filter_read_permission:
return optimize_queryset(sublist_qs)
qs = self.request.user.get_queryset(self.model).distinct()
return qs & sublist_qs
def get_sublist_queryset(self, parent):
@@ -967,16 +965,11 @@ class CopyAPIView(GenericAPIView):
if hasattr(new_obj, 'admin_role') and request.user not in new_obj.admin_role.members.all():
new_obj.admin_role.members.add(request.user)
if sub_objs:
# store the copied object dict into cache, because it's
# often too large for postgres' notification bus
# (which has a default maximum message size of 8k)
key = 'deep-copy-{}'.format(str(uuid.uuid4()))
cache.set(key, sub_objs, timeout=3600)
permission_check_func = None
if hasattr(type(self), 'deep_copy_permission_check_func'):
permission_check_func = (type(self).__module__, type(self).__name__, 'deep_copy_permission_check_func')
trigger_delayed_deep_copy(
self.model.__module__, self.model.__name__, obj.pk, new_obj.pk, request.user.pk, key, permission_check_func=permission_check_func
self.model.__module__, self.model.__name__, obj.pk, new_obj.pk, request.user.pk, permission_check_func=permission_check_func
)
serializer = self._get_copy_return_serializer(new_obj)
headers = {'Location': new_obj.get_absolute_url(request=request)}

View File

@@ -954,7 +954,7 @@ class UnifiedJobStdoutSerializer(UnifiedJobSerializer):
class UserSerializer(BaseSerializer):
password = serializers.CharField(required=False, default='', write_only=True, help_text=_('Write-only field used to change the password.'))
password = serializers.CharField(required=False, default='', help_text=_('Field used to change the password.'))
ldap_dn = serializers.CharField(source='profile.ldap_dn', read_only=True)
external_account = serializers.SerializerMethodField(help_text=_('Set if the account is managed by an external service'))
is_system_auditor = serializers.BooleanField(default=False)
@@ -981,7 +981,12 @@ class UserSerializer(BaseSerializer):
def to_representation(self, obj):
ret = super(UserSerializer, self).to_representation(obj)
ret.pop('password', None)
if self.get_external_account(obj):
# If this is an external account it shouldn't have a password field
ret.pop('password', None)
else:
# If it's an internal account, let's assume there is a password and return $encrypted$ to the user
ret['password'] = '$encrypted$'
if obj and type(self) is UserSerializer:
ret['auth'] = obj.social_auth.values('provider', 'uid')
return ret
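Illustratively, a hedged sketch of the resulting representation behavior outside DRF (field values are made up):

```python
def represent(ret: dict, is_external_account: bool) -> dict:
    # External accounts drop the password field entirely; internal accounts get
    # the sentinel '$encrypted$' instead of any real value.
    if is_external_account:
        ret.pop('password', None)
    else:
        ret['password'] = '$encrypted$'
    return ret

print(represent({'username': 'alice', 'password': 'xyz'}, False))
# {'username': 'alice', 'password': '$encrypted$'}
```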
@@ -1019,7 +1024,7 @@ class UserSerializer(BaseSerializer):
# For now we're not raising an error, just not saving password for
# users managed by LDAP who already have an unusable password set.
# get_external_account will return something like ldap or enterprise, or None if the user isn't external. We only want to allow a password update for the None option
if new_password and not self.get_external_account(obj):
if new_password and new_password != '$encrypted$' and not self.get_external_account(obj):
obj.set_password(new_password)
obj.save(update_fields=['password'])
@@ -2185,7 +2190,7 @@ class BulkHostCreateSerializer(serializers.Serializer):
host_data = []
for r in result:
item = {k: getattr(r, k) for k in return_keys}
if not settings.IS_TESTING_MODE:
if settings.DATABASES and ('sqlite3' not in settings.DATABASES.get('default', {}).get('ENGINE')):
# sqlite acts different with bulk_create -- it doesn't return the id of the objects
# to get it, you have to do an additional query, which is not useful for our tests
item['url'] = reverse('api:host_detail', kwargs={'pk': r.id})
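A hedged, standalone sketch of the engine check above (settings replaced with a plain dict):

```python
# Stand-in for django.conf.settings.DATABASES; in the diff this condition guards
# the per-host URL reverse() that sqlite cannot support, since sqlite's
# bulk_create does not return the ids of the created objects.
DATABASES = {"default": {"ENGINE": "django.db.backends.postgresql"}}

has_ids = DATABASES and "sqlite3" not in DATABASES.get("default", {}).get("ENGINE", "")
print("include url field:", has_ids)
```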

View File

@@ -62,7 +62,7 @@ from wsgiref.util import FileWrapper
# AWX
from awx.main.tasks.system import send_notifications, update_inventory_computed_fields
from awx.main.access import get_user_queryset, HostAccess
from awx.main.access import get_user_queryset
from awx.api.generics import (
APIView,
BaseUsersList,
@@ -794,13 +794,7 @@ class ExecutionEnvironmentActivityStreamList(SubListAPIView):
parent_model = models.ExecutionEnvironment
relationship = 'activitystream_set'
search_fields = ('changes',)
def get_queryset(self):
parent = self.get_parent_object()
self.check_parent_access(parent)
qs = self.request.user.get_queryset(self.model)
return qs.filter(execution_environment=parent)
filter_read_permission = False
class ProjectList(ListCreateAPIView):
@@ -1634,13 +1628,7 @@ class InventoryHostsList(HostRelatedSearchMixin, SubListCreateAttachDetachAPIVie
parent_model = models.Inventory
relationship = 'hosts'
parent_key = 'inventory'
def get_queryset(self):
inventory = self.get_parent_object()
qs = getattrd(inventory, self.relationship).all()
# Apply queryset optimizations
qs = qs.select_related(*HostAccess.select_related).prefetch_related(*HostAccess.prefetch_related)
return qs
filter_read_permission = False
class HostGroupsList(SubListCreateAttachDetachAPIView):
@@ -2581,16 +2569,7 @@ class JobTemplateCredentialsList(SubListCreateAttachDetachAPIView):
serializer_class = serializers.CredentialSerializer
parent_model = models.JobTemplate
relationship = 'credentials'
def get_queryset(self):
# Return the full list of credentials
parent = self.get_parent_object()
self.check_parent_access(parent)
sublist_qs = getattrd(parent, self.relationship)
sublist_qs = sublist_qs.prefetch_related(
'created_by', 'modified_by', 'admin_role', 'use_role', 'read_role', 'admin_role__parents', 'admin_role__members'
)
return sublist_qs
filter_read_permission = False
def is_valid_relation(self, parent, sub, created=False):
if sub.unique_hash() in [cred.unique_hash() for cred in parent.credentials.all()]:
@@ -2692,7 +2671,10 @@ class JobTemplateCallback(GenericAPIView):
# Permission class should have already validated host_config_key.
job_template = self.get_object()
# Attempt to find matching hosts based on remote address.
matching_hosts = self.find_matching_hosts()
if job_template.inventory:
matching_hosts = self.find_matching_hosts()
else:
return Response({"msg": _("Cannot start automatically, an inventory is required.")}, status=status.HTTP_400_BAD_REQUEST)
# If the host is not found, update the inventory before trying to
# match again.
inventory_sources_already_updated = []
@@ -2777,6 +2759,7 @@ class JobTemplateInstanceGroupsList(SubListAttachDetachAPIView):
serializer_class = serializers.InstanceGroupSerializer
parent_model = models.JobTemplate
relationship = 'instance_groups'
filter_read_permission = False
class JobTemplateAccessList(ResourceAccessList):
@@ -2867,16 +2850,7 @@ class WorkflowJobTemplateNodeChildrenBaseList(EnforceParentRelationshipMixin, Su
relationship = ''
enforce_parent_relationship = 'workflow_job_template'
search_fields = ('unified_job_template__name', 'unified_job_template__description')
'''
Limit the set of WorkflowJobTemplateNodes to the related nodes of specified by
'relationship'
'''
def get_queryset(self):
parent = self.get_parent_object()
self.check_parent_access(parent)
return getattr(parent, self.relationship).all()
filter_read_permission = False
def is_valid_relation(self, parent, sub, created=False):
if created:
@@ -2951,14 +2925,7 @@ class WorkflowJobNodeChildrenBaseList(SubListAPIView):
parent_model = models.WorkflowJobNode
relationship = ''
search_fields = ('unified_job_template__name', 'unified_job_template__description')
#
# Limit the set of WorkflowJobNodes to the related nodes of specified by self.relationship
#
def get_queryset(self):
parent = self.get_parent_object()
self.check_parent_access(parent)
return getattr(parent, self.relationship).all()
filter_read_permission = False
class WorkflowJobNodeSuccessNodesList(WorkflowJobNodeChildrenBaseList):
@@ -3137,11 +3104,8 @@ class WorkflowJobTemplateWorkflowNodesList(SubListCreateAPIView):
relationship = 'workflow_job_template_nodes'
parent_key = 'workflow_job_template'
search_fields = ('unified_job_template__name', 'unified_job_template__description')
def get_queryset(self):
parent = self.get_parent_object()
self.check_parent_access(parent)
return getattr(parent, self.relationship).order_by('id')
ordering = ('id',) # assure ordering by id for consistency
filter_read_permission = False
class WorkflowJobTemplateJobsList(SubListAPIView):
@@ -3233,11 +3197,8 @@ class WorkflowJobWorkflowNodesList(SubListAPIView):
relationship = 'workflow_job_nodes'
parent_key = 'workflow_job'
search_fields = ('unified_job_template__name', 'unified_job_template__description')
def get_queryset(self):
parent = self.get_parent_object()
self.check_parent_access(parent)
return getattr(parent, self.relationship).order_by('id')
ordering = ('id',) # assure ordering by id for consistency
filter_read_permission = False
class WorkflowJobCancel(GenericCancelView):
@@ -3551,11 +3512,7 @@ class BaseJobHostSummariesList(SubListAPIView):
relationship = 'job_host_summaries'
name = _('Job Host Summaries List')
search_fields = ('host_name',)
def get_queryset(self):
parent = self.get_parent_object()
self.check_parent_access(parent)
return getattr(parent, self.relationship).select_related('job', 'job__job_template', 'host')
filter_read_permission = False
class HostJobHostSummariesList(BaseJobHostSummariesList):

View File

@@ -61,12 +61,6 @@ class OrganizationList(OrganizationCountsMixin, ListCreateAPIView):
model = Organization
serializer_class = OrganizationSerializer
def get_queryset(self):
qs = Organization.accessible_objects(self.request.user, 'read_role')
qs = qs.select_related('admin_role', 'auditor_role', 'member_role', 'read_role')
qs = qs.prefetch_related('created_by', 'modified_by')
return qs
class OrganizationDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAPIView):
model = Organization
@@ -207,6 +201,7 @@ class OrganizationInstanceGroupsList(SubListAttachDetachAPIView):
serializer_class = InstanceGroupSerializer
parent_model = Organization
relationship = 'instance_groups'
filter_read_permission = False
class OrganizationGalaxyCredentialsList(SubListAttachDetachAPIView):
@@ -214,6 +209,7 @@ class OrganizationGalaxyCredentialsList(SubListAttachDetachAPIView):
serializer_class = CredentialSerializer
parent_model = Organization
relationship = 'galaxy_credentials'
filter_read_permission = False
def is_valid_relation(self, parent, sub, created=False):
if sub.kind != 'galaxy_api_token':

View File

@@ -2952,3 +2952,19 @@ class WorkflowApprovalTemplateAccess(BaseAccess):
for cls in BaseAccess.__subclasses__():
access_registry[cls.model] = cls
access_registry[UnpartitionedJobEvent] = UnpartitionedJobEventAccess
def optimize_queryset(queryset):
"""
A utility method in case you already have a queryset and just want to
apply the standard optimizations for that model.
In other words, use if you do not want to start from filtered_queryset for some reason.
"""
if not queryset.model or queryset.model not in access_registry:
return queryset
access_class = access_registry[queryset.model]
if access_class.select_related:
queryset = queryset.select_related(*access_class.select_related)
if access_class.prefetch_related:
queryset = queryset.prefetch_related(*access_class.prefetch_related)
return queryset
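For context, a hedged usage sketch of this helper (imports follow the modules shown in these diffs; assumes a configured Django environment):

```python
from awx.main.access import optimize_queryset
from awx.main.models import Organization

# Apply the model's registered select_related/prefetch_related optimizations to
# an existing queryset without rebuilding it from filtered_queryset.
qs = optimize_queryset(Organization.objects.all())
```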

View File

@@ -399,7 +399,10 @@ def _copy_table(table, query, path):
file_path = os.path.join(path, table + '_table.csv')
file = FileSplitter(filespec=file_path)
with connection.cursor() as cursor:
cursor.copy_expert(query, file)
with cursor.copy(query) as copy:
while data := copy.read():
byte_data = bytes(data)
file.write(byte_data.decode())
return file.file_list()
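A standalone, hedged sketch of the psycopg3 COPY read loop used above (connection string and table name are placeholders):

```python
import psycopg  # psycopg3

# Stream a table out of postgres with COPY; copy.read() returns successive byte
# buffers and an empty buffer once the stream is exhausted.
with psycopg.connect("dbname=awx") as conn:
    with conn.cursor() as cursor:
        with cursor.copy("COPY main_jobevent TO STDOUT WITH CSV") as copy:
            while data := copy.read():
                print(bytes(data).decode(), end="")
```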

View File

@@ -9,6 +9,7 @@ from django.conf import settings
from django.utils.functional import cached_property
from django.utils.timezone import now as tz_now
from django.db import transaction, connection as django_connection
from django.db.utils import DataError
from django_guid import set_guid
import psutil
@@ -191,10 +192,16 @@ class CallbackBrokerWorker(BaseWorker):
e._retry_count = retry_count
# special sanitization logic for postgres treatment of NUL 0x00 char
if (retry_count == 1) and isinstance(exc_indv, ValueError) and ("\x00" in e.stdout):
e.stdout = e.stdout.replace("\x00", "")
if retry_count >= self.INDIVIDUAL_EVENT_RETRIES:
if (retry_count == 1) and isinstance(exc_indv, DataError):
# The easiest place for a NUL to appear is in stdout. Postgres raises an error stating that it can't save a NUL character
if "\x00" in e.stdout:
e.stdout = e.stdout.replace("\x00", "")
# There is also a chance that the NUL char is embedded in event data, which is part of a JSON blob. In that case we, thankfully, get a different exception
if 'unsupported Unicode escape sequence' in str(exc_indv):
e.event_data = json.loads(
json.dumps(e.event_data).replace("\x00", "").replace("\\x00", "").replace("\u0000", "").replace("\\u0000", "")
)
elif retry_count >= self.INDIVIDUAL_EVENT_RETRIES:
logger.error(f'Hit max retries ({retry_count}) saving individual Event error: {str(exc_indv)}\ndata:\n{e.__dict__}')
events.remove(e)
else:
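A minimal, hedged sketch of the scrubbing idea above, runnable on a plain dict (the retry plumbing is omitted; the replacement set mirrors the diff, noting that "\x00" and "\u0000" are the same Python literal):

```python
import json

def strip_nul(event_data: dict) -> dict:
    # Round-trip through JSON and drop the spellings of NUL that postgres
    # rejects: the literal character plus escaped \x00 / \u0000 sequences.
    text = json.dumps(event_data)
    for token in ("\x00", "\\x00", "\\u0000"):
        text = text.replace(token, "")
    return json.loads(text)

print(strip_nul({"msg": "bad\x00byte"}))  # {'msg': 'badbyte'}
```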

View File

@@ -2,6 +2,8 @@ import json
import logging
import os
import time
import signal
import sys
from django.core.management.base import BaseCommand
from django.conf import settings
@@ -50,6 +52,11 @@ class Command(BaseCommand):
}
return json.dumps(payload)
def notify_listener_and_exit(self, *args):
with pg_bus_conn(new_connection=False) as conn:
conn.notify('web_heartbeet', self.construct_payload(action='offline'))
sys.exit(0)
def do_hearbeat_loop(self):
with pg_bus_conn(new_connection=True) as conn:
while True:
@@ -57,10 +64,10 @@ class Command(BaseCommand):
conn.notify('web_heartbeet', self.construct_payload())
time.sleep(settings.BROADCAST_WEBSOCKET_BEACON_FROM_WEB_RATE_SECONDS)
# TODO: Send a message with action=offline if we notice a SIGTERM or SIGINT
# (wsrelay can use this to remove the node quicker)
def handle(self, *arg, **options):
self.print_banner()
signal.signal(signal.SIGTERM, self.notify_listener_and_exit)
signal.signal(signal.SIGINT, self.notify_listener_and_exit)
# Note: We don't really try any reconnect logic to pg_notify here,
# just let supervisor restart if we fail.
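A self-contained sketch of the shutdown pattern above (the pg_notify plumbing is replaced with a hypothetical stand-in):

```python
import signal
import sys

def notify(action: str) -> None:
    # Hypothetical stand-in for the pg_notify('web_heartbeet', ...) call above;
    # prints instead of touching postgres.
    print(f"web node going {action}")

def notify_listener_and_exit(signum, frame):
    # Tell the listener (wsrelay) this node is going offline, then exit cleanly.
    notify(action="offline")
    sys.exit(0)

signal.signal(signal.SIGTERM, notify_listener_and_exit)
signal.signal(signal.SIGINT, notify_listener_and_exit)
```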

View File

@@ -2,9 +2,6 @@
# Python
from __future__ import unicode_literals
# Psycopg2
from psycopg2.extensions import AsIs
# Django
from django.db import connection, migrations, models, OperationalError, ProgrammingError
from django.conf import settings
@@ -136,8 +133,8 @@ class Migration(migrations.Migration):
),
),
migrations.RunSQL(
[("CREATE INDEX host_ansible_facts_default_gin ON %s USING gin" "(ansible_facts jsonb_path_ops);", [AsIs(Host._meta.db_table)])],
[('DROP INDEX host_ansible_facts_default_gin;', None)],
sql="CREATE INDEX host_ansible_facts_default_gin ON {} USING gin(ansible_facts jsonb_path_ops);".format(Host._meta.db_table),
reverse_sql='DROP INDEX host_ansible_facts_default_gin;',
),
# SCM file-based inventories
migrations.AddField(

View File

@@ -22,10 +22,8 @@ def migrate_event_data(apps, schema_editor):
# recreate counter for the new table's primary key to
# start where the *old* table left off (we have to do this because the
# counter changed from an int to a bigint)
cursor.execute(f'DROP SEQUENCE IF EXISTS "{tblname}_id_seq" CASCADE;')
cursor.execute(f'CREATE SEQUENCE "{tblname}_id_seq";')
cursor.execute(f'ALTER TABLE "{tblname}" ALTER COLUMN "id" ' f"SET DEFAULT nextval('{tblname}_id_seq');")
cursor.execute(f"SELECT setval('{tblname}_id_seq', (SELECT MAX(id) FROM _old_{tblname}), true);")
cursor.execute(f'CREATE SEQUENCE IF NOT EXISTS "{tblname}_id_seq";')
cursor.execute(f"SELECT setval('{tblname}_id_seq', COALESCE((SELECT MAX(id)+1 FROM _old_{tblname}), 1), false);")
cursor.execute(f'DROP TABLE _old_{tblname};')

View File

@@ -0,0 +1,30 @@
# Generated by Django 3.2.16 on 2023-04-21 14:15
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.OAUTH2_PROVIDER_ID_TOKEN_MODEL),
('main', '0182_constructed_inventory'),
('oauth2_provider', '0005_auto_20211222_2352'),
]
operations = [
migrations.AddField(
model_name='oauth2accesstoken',
name='id_token',
field=models.OneToOneField(
blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='access_token', to=settings.OAUTH2_PROVIDER_ID_TOKEN_MODEL
),
),
migrations.AddField(
model_name='oauth2application',
name='algorithm',
field=models.CharField(
blank=True, choices=[('', 'No OIDC support'), ('RS256', 'RSA with SHA-2 256'), ('HS256', 'HMAC with SHA-2 256')], default='', max_length=5
),
),
]

View File

@@ -0,0 +1,972 @@
# Generated by Django 4.2 on 2023-04-21 14:43
import awx.main.fields
import awx.main.utils.polymorphic
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('contenttypes', '0002_remove_content_type_name'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('main', '0183_pre_django_upgrade'),
]
operations = [
migrations.AlterField(
model_name='activitystream',
name='unified_job',
field=models.ManyToManyField(blank=True, related_name='activity_stream_as_unified_job+', to='main.unifiedjob'),
),
migrations.AlterField(
model_name='activitystream',
name='unified_job_template',
field=models.ManyToManyField(blank=True, related_name='activity_stream_as_unified_job_template+', to='main.unifiedjobtemplate'),
),
migrations.AlterField(
model_name='credential',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='credential',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='credentialinputsource',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='credentialinputsource',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='credentialtype',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='credentialtype',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='custominventoryscript',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='custominventoryscript',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='executionenvironment',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='executionenvironment',
name='credential',
field=models.ForeignKey(
blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)ss', to='main.credential'
),
),
migrations.AlterField(
model_name='executionenvironment',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='executionenvironment',
name='organization',
field=models.ForeignKey(
blank=True,
default=None,
help_text='The organization used to determine access to this execution environment.',
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name='%(class)ss',
to='main.organization',
),
),
migrations.AlterField(
model_name='group',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='group',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='host',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='host',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='host',
name='smart_inventories',
field=models.ManyToManyField(related_name='+', through='main.SmartInventoryMembership', to='main.inventory'),
),
migrations.AlterField(
model_name='instancegroup',
name='credential',
field=models.ForeignKey(
blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)ss', to='main.credential'
),
),
migrations.AlterField(
model_name='inventory',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='inventory',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='job',
name='inventory',
field=models.ForeignKey(
blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)ss', to='main.inventory'
),
),
migrations.AlterField(
model_name='job',
name='project',
field=models.ForeignKey(
blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)ss', to='main.project'
),
),
migrations.AlterField(
model_name='job',
name='webhook_credential',
field=models.ForeignKey(
blank=True,
help_text='Personal Access Token for posting back the status to the service API',
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)ss',
to='main.credential',
),
),
migrations.AlterField(
model_name='joblaunchconfig',
name='credentials',
field=models.ManyToManyField(related_name='%(class)ss', to='main.credential'),
),
migrations.AlterField(
model_name='joblaunchconfig',
name='execution_environment',
field=models.ForeignKey(
blank=True,
default=None,
help_text='The container image to be used for execution.',
null=True,
on_delete=awx.main.utils.polymorphic.SET_NULL,
related_name='%(class)s_as_prompt',
to='main.executionenvironment',
),
),
migrations.AlterField(
model_name='joblaunchconfig',
name='instance_groups',
field=awx.main.fields.OrderedManyToManyField(
blank=True, editable=False, related_name='%(class)ss', through='main.JobLaunchConfigInstanceGroupMembership', to='main.instancegroup'
),
),
migrations.AlterField(
model_name='joblaunchconfig',
name='inventory',
field=models.ForeignKey(
blank=True,
default=None,
help_text='Inventory applied as a prompt, assuming job template prompts for inventory',
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)ss',
to='main.inventory',
),
),
migrations.AlterField(
model_name='joblaunchconfig',
name='labels',
field=models.ManyToManyField(related_name='%(class)s_labels', to='main.label'),
),
migrations.AlterField(
model_name='jobtemplate',
name='inventory',
field=models.ForeignKey(
blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)ss', to='main.inventory'
),
),
migrations.AlterField(
model_name='jobtemplate',
name='project',
field=models.ForeignKey(
blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)ss', to='main.project'
),
),
migrations.AlterField(
model_name='jobtemplate',
name='webhook_credential',
field=models.ForeignKey(
blank=True,
help_text='Personal Access Token for posting back the status to the service API',
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)ss',
to='main.credential',
),
),
migrations.AlterField(
model_name='label',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='label',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='notificationtemplate',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='notificationtemplate',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='oauth2accesstoken',
name='user',
field=models.ForeignKey(
blank=True,
help_text='The user representing the token owner',
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name='%(app_label)s_%(class)s',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='oauth2application',
name='user',
field=models.ForeignKey(
blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='%(app_label)s_%(class)s', to=settings.AUTH_USER_MODEL
),
),
migrations.AlterField(
model_name='organization',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='organization',
name='galaxy_credentials',
field=awx.main.fields.OrderedManyToManyField(
blank=True, related_name='%(class)s_galaxy_credentials', through='main.OrganizationGalaxyCredentialMembership', to='main.credential'
),
),
migrations.AlterField(
model_name='organization',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='organization',
name='notification_templates_approvals',
field=models.ManyToManyField(blank=True, related_name='%(class)s_notification_templates_for_approvals', to='main.notificationtemplate'),
),
migrations.AlterField(
model_name='organization',
name='notification_templates_error',
field=models.ManyToManyField(blank=True, related_name='%(class)s_notification_templates_for_errors', to='main.notificationtemplate'),
),
migrations.AlterField(
model_name='organization',
name='notification_templates_started',
field=models.ManyToManyField(blank=True, related_name='%(class)s_notification_templates_for_started', to='main.notificationtemplate'),
),
migrations.AlterField(
model_name='organization',
name='notification_templates_success',
field=models.ManyToManyField(blank=True, related_name='%(class)s_notification_templates_for_success', to='main.notificationtemplate'),
),
migrations.AlterField(
model_name='project',
name='credential',
field=models.ForeignKey(
blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)ss', to='main.credential'
),
),
migrations.AlterField(
model_name='project',
name='signature_validation_credential',
field=models.ForeignKey(
blank=True,
default=None,
help_text='An optional credential used for validating files in the project against unexpected changes.',
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)ss_signature_validation',
to='main.credential',
),
),
migrations.AlterField(
model_name='projectupdate',
name='credential',
field=models.ForeignKey(
blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)ss', to='main.credential'
),
),
migrations.AlterField(
model_name='schedule',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='schedule',
name='credentials',
field=models.ManyToManyField(related_name='%(class)ss', to='main.credential'),
),
migrations.AlterField(
model_name='schedule',
name='execution_environment',
field=models.ForeignKey(
blank=True,
default=None,
help_text='The container image to be used for execution.',
null=True,
on_delete=awx.main.utils.polymorphic.SET_NULL,
related_name='%(class)s_as_prompt',
to='main.executionenvironment',
),
),
migrations.AlterField(
model_name='schedule',
name='inventory',
field=models.ForeignKey(
blank=True,
default=None,
help_text='Inventory applied as a prompt, assuming job template prompts for inventory',
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)ss',
to='main.inventory',
),
),
migrations.AlterField(
model_name='schedule',
name='labels',
field=models.ManyToManyField(related_name='%(class)s_labels', to='main.label'),
),
migrations.AlterField(
model_name='schedule',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='team',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='team',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='unifiedjob',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='unifiedjob',
name='credentials',
field=models.ManyToManyField(related_name='%(class)ss', to='main.credential'),
),
migrations.AlterField(
model_name='unifiedjob',
name='dependent_jobs',
field=models.ManyToManyField(editable=False, related_name='%(class)s_blocked_jobs', to='main.unifiedjob'),
),
migrations.AlterField(
model_name='unifiedjob',
name='execution_environment',
field=models.ForeignKey(
blank=True,
default=None,
help_text='The container image to be used for execution.',
null=True,
on_delete=awx.main.utils.polymorphic.SET_NULL,
related_name='%(class)ss',
to='main.executionenvironment',
),
),
migrations.AlterField(
model_name='unifiedjob',
name='labels',
field=models.ManyToManyField(blank=True, related_name='%(class)s_labels', to='main.label'),
),
migrations.AlterField(
model_name='unifiedjob',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='unifiedjob',
name='notifications',
field=models.ManyToManyField(editable=False, related_name='%(class)s_notifications', to='main.notification'),
),
migrations.AlterField(
model_name='unifiedjob',
name='organization',
field=models.ForeignKey(
blank=True,
help_text='The organization used to determine access to this unified job.',
null=True,
on_delete=awx.main.utils.polymorphic.SET_NULL,
related_name='%(class)ss',
to='main.organization',
),
),
migrations.AlterField(
model_name='unifiedjob',
name='polymorphic_ctype',
field=models.ForeignKey(
editable=False,
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name='polymorphic_%(app_label)s.%(class)s_set+',
to='contenttypes.contenttype',
),
),
migrations.AlterField(
model_name='unifiedjob',
name='unified_job_template',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=awx.main.utils.polymorphic.SET_NULL,
related_name='%(class)s_unified_jobs',
to='main.unifiedjobtemplate',
),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='created_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_created+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='credentials',
field=models.ManyToManyField(related_name='%(class)ss', to='main.credential'),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='current_job',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)s_as_current_job+',
to='main.unifiedjob',
),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='execution_environment',
field=models.ForeignKey(
blank=True,
default=None,
help_text='The container image to be used for execution.',
null=True,
on_delete=awx.main.utils.polymorphic.SET_NULL,
related_name='%(class)ss',
to='main.executionenvironment',
),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='labels',
field=models.ManyToManyField(blank=True, related_name='%(class)s_labels', to='main.label'),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='last_job',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)s_as_last_job+',
to='main.unifiedjob',
),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='modified_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%s(class)s_modified+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='next_schedule',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=awx.main.utils.polymorphic.SET_NULL,
related_name='%(class)s_as_next_schedule+',
to='main.schedule',
),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='notification_templates_error',
field=models.ManyToManyField(blank=True, related_name='%(class)s_notification_templates_for_errors', to='main.notificationtemplate'),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='notification_templates_started',
field=models.ManyToManyField(blank=True, related_name='%(class)s_notification_templates_for_started', to='main.notificationtemplate'),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='notification_templates_success',
field=models.ManyToManyField(blank=True, related_name='%(class)s_notification_templates_for_success', to='main.notificationtemplate'),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='organization',
field=models.ForeignKey(
blank=True,
help_text='The organization used to determine access to this template.',
null=True,
on_delete=awx.main.utils.polymorphic.SET_NULL,
related_name='%(class)ss',
to='main.organization',
),
),
migrations.AlterField(
model_name='unifiedjobtemplate',
name='polymorphic_ctype',
field=models.ForeignKey(
editable=False,
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name='polymorphic_%(app_label)s.%(class)s_set+',
to='contenttypes.contenttype',
),
),
migrations.AlterField(
model_name='workflowapproval',
name='approved_or_denied_by',
field=models.ForeignKey(
default=None,
editable=False,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)s_approved+',
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name='workflowjob',
name='inventory',
field=models.ForeignKey(
blank=True,
default=None,
help_text='Inventory applied as a prompt, assuming job template prompts for inventory',
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)ss',
to='main.inventory',
),
),
migrations.AlterField(
model_name='workflowjob',
name='webhook_credential',
field=models.ForeignKey(
blank=True,
help_text='Personal Access Token for posting back the status to the service API',
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)ss',
to='main.credential',
),
),
migrations.AlterField(
model_name='workflowjobnode',
name='always_nodes',
field=models.ManyToManyField(blank=True, related_name='%(class)ss_always', to='main.workflowjobnode'),
),
migrations.AlterField(
model_name='workflowjobnode',
name='credentials',
field=models.ManyToManyField(related_name='%(class)ss', to='main.credential'),
),
migrations.AlterField(
model_name='workflowjobnode',
name='execution_environment',
field=models.ForeignKey(
blank=True,
default=None,
help_text='The container image to be used for execution.',
null=True,
on_delete=awx.main.utils.polymorphic.SET_NULL,
related_name='%(class)s_as_prompt',
to='main.executionenvironment',
),
),
migrations.AlterField(
model_name='workflowjobnode',
name='failure_nodes',
field=models.ManyToManyField(blank=True, related_name='%(class)ss_failure', to='main.workflowjobnode'),
),
migrations.AlterField(
model_name='workflowjobnode',
name='inventory',
field=models.ForeignKey(
blank=True,
default=None,
help_text='Inventory applied as a prompt, assuming job template prompts for inventory',
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)ss',
to='main.inventory',
),
),
migrations.AlterField(
model_name='workflowjobnode',
name='labels',
field=models.ManyToManyField(related_name='%(class)s_labels', to='main.label'),
),
migrations.AlterField(
model_name='workflowjobnode',
name='success_nodes',
field=models.ManyToManyField(blank=True, related_name='%(class)ss_success', to='main.workflowjobnode'),
),
migrations.AlterField(
model_name='workflowjobnode',
name='unified_job_template',
field=models.ForeignKey(
blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)ss', to='main.unifiedjobtemplate'
),
),
migrations.AlterField(
model_name='workflowjobtemplate',
name='inventory',
field=models.ForeignKey(
blank=True,
default=None,
help_text='Inventory applied as a prompt, assuming job template prompts for inventory',
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)ss',
to='main.inventory',
),
),
migrations.AlterField(
model_name='workflowjobtemplate',
name='notification_templates_approvals',
field=models.ManyToManyField(blank=True, related_name='%(class)s_notification_templates_for_approvals', to='main.notificationtemplate'),
),
migrations.AlterField(
model_name='workflowjobtemplate',
name='webhook_credential',
field=models.ForeignKey(
blank=True,
help_text='Personal Access Token for posting back the status to the service API',
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)ss',
to='main.credential',
),
),
migrations.AlterField(
model_name='workflowjobtemplatenode',
name='always_nodes',
field=models.ManyToManyField(blank=True, related_name='%(class)ss_always', to='main.workflowjobtemplatenode'),
),
migrations.AlterField(
model_name='workflowjobtemplatenode',
name='credentials',
field=models.ManyToManyField(related_name='%(class)ss', to='main.credential'),
),
migrations.AlterField(
model_name='workflowjobtemplatenode',
name='execution_environment',
field=models.ForeignKey(
blank=True,
default=None,
help_text='The container image to be used for execution.',
null=True,
on_delete=awx.main.utils.polymorphic.SET_NULL,
related_name='%(class)s_as_prompt',
to='main.executionenvironment',
),
),
migrations.AlterField(
model_name='workflowjobtemplatenode',
name='failure_nodes',
field=models.ManyToManyField(blank=True, related_name='%(class)ss_failure', to='main.workflowjobtemplatenode'),
),
migrations.AlterField(
model_name='workflowjobtemplatenode',
name='inventory',
field=models.ForeignKey(
blank=True,
default=None,
help_text='Inventory applied as a prompt, assuming job template prompts for inventory',
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='%(class)ss',
to='main.inventory',
),
),
migrations.AlterField(
model_name='workflowjobtemplatenode',
name='labels',
field=models.ManyToManyField(related_name='%(class)s_labels', to='main.label'),
),
migrations.AlterField(
model_name='workflowjobtemplatenode',
name='success_nodes',
field=models.ManyToManyField(blank=True, related_name='%(class)ss_success', to='main.workflowjobtemplatenode'),
),
migrations.AlterField(
model_name='workflowjobtemplatenode',
name='unified_job_template',
field=models.ForeignKey(
blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)ss', to='main.unifiedjobtemplate'
),
),
]

View File

@@ -0,0 +1,102 @@
# Generated by Django 4.2 on 2023-04-28 19:21
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('main', '0184_django_upgrade'),
]
operations = [
migrations.RenameIndex(
model_name='adhoccommandevent',
new_name='main_adhocc_ad_hoc__a57777_idx',
old_fields=('ad_hoc_command', 'job_created', 'counter'),
),
migrations.RenameIndex(
model_name='adhoccommandevent',
new_name='main_adhocc_ad_hoc__e72142_idx',
old_fields=('ad_hoc_command', 'job_created', 'event'),
),
migrations.RenameIndex(
model_name='adhoccommandevent',
new_name='main_adhocc_ad_hoc__1e4d24_idx',
old_fields=('ad_hoc_command', 'job_created', 'uuid'),
),
migrations.RenameIndex(
model_name='inventoryupdateevent',
new_name='main_invent_invento_f72b21_idx',
old_fields=('inventory_update', 'job_created', 'uuid'),
),
migrations.RenameIndex(
model_name='inventoryupdateevent',
new_name='main_invent_invento_364dcb_idx',
old_fields=('inventory_update', 'job_created', 'counter'),
),
migrations.RenameIndex(
model_name='jobevent',
new_name='main_jobeve_job_id_51c382_idx',
old_fields=('job', 'job_created', 'counter'),
),
migrations.RenameIndex(
model_name='jobevent',
new_name='main_jobeve_job_id_0ddc6b_idx',
old_fields=('job', 'job_created', 'event'),
),
migrations.RenameIndex(
model_name='jobevent',
new_name='main_jobeve_job_id_40a56d_idx',
old_fields=('job', 'job_created', 'parent_uuid'),
),
migrations.RenameIndex(
model_name='jobevent',
new_name='main_jobeve_job_id_3c4a4a_idx',
old_fields=('job', 'job_created', 'uuid'),
),
migrations.RenameIndex(
model_name='projectupdateevent',
new_name='main_projec_project_c44b7c_idx',
old_fields=('project_update', 'job_created', 'event'),
),
migrations.RenameIndex(
model_name='projectupdateevent',
new_name='main_projec_project_449bbd_idx',
old_fields=('project_update', 'job_created', 'uuid'),
),
migrations.RenameIndex(
model_name='projectupdateevent',
new_name='main_projec_project_69559a_idx',
old_fields=('project_update', 'job_created', 'counter'),
),
migrations.RenameIndex(
model_name='role',
new_name='main_rbac_r_content_979bdd_idx',
old_fields=('content_type', 'object_id'),
),
migrations.RenameIndex(
model_name='roleancestorentry',
new_name='main_rbac_r_ancesto_22b9f0_idx',
old_fields=('ancestor', 'content_type_id', 'object_id'),
),
migrations.RenameIndex(
model_name='roleancestorentry',
new_name='main_rbac_r_ancesto_b44606_idx',
old_fields=('ancestor', 'content_type_id', 'role_field'),
),
migrations.RenameIndex(
model_name='roleancestorentry',
new_name='main_rbac_r_ancesto_c87b87_idx',
old_fields=('ancestor', 'descendent'),
),
migrations.RenameIndex(
model_name='systemjobevent',
new_name='main_system_system__e39825_idx',
old_fields=('system_job', 'job_created', 'uuid'),
),
migrations.RenameIndex(
model_name='systemjobevent',
new_name='main_system_system__73537a_idx',
old_fields=('system_job', 'job_created', 'counter'),
),
]
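
The RenameIndex operations above pair with the index_together removals later in this diff: the composite indexes already exist in the database under auto-assigned names, and RenameIndex with old_fields attaches Django 4.2's generated names to them, so the upgrade renames in place rather than dropping and rebuilding each index. A minimal sketch of one such operation, reusing names from this migration:

# Sketch only: new_name is whatever `makemigrations` generated for the
# models.Index that replaced the old index_together tuple; old_fields
# identifies the existing unnamed index to adopt.
from django.db import migrations

rename = migrations.RenameIndex(
    model_name='jobevent',
    new_name='main_jobeve_job_id_51c382_idx',
    old_fields=('job', 'job_created', 'counter'),
)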

View File

@@ -8,7 +8,7 @@ logger = logging.getLogger('awx.main.migrations')
def migrate_org_admin_to_use(apps, schema_editor):
logger.info('Initiated migration from Org admin to use role')
roles_added = 0
for org in Organization.objects.prefetch_related('admin_role__members').iterator():
for org in Organization.objects.prefetch_related('admin_role__members').iterator(chunk_size=1000):
igs = list(org.instance_groups.all())
if not igs:
continue
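
For context, this one-line change addresses a Django deprecation: since 4.1, iterator() supports prefetch_related() only when chunk_size is passed explicitly, and omitting it raises a deprecation warning slated to become an error. A minimal sketch of the fixed call shape, where Organization is the model already in scope above and process() is a placeholder for the loop body:

# chunk_size makes iterator() fetch rows in batches and run the prefetch
# query once per batch instead of silently skipping the prefetch.
for org in Organization.objects.prefetch_related('admin_role__members').iterator(chunk_size=1000):
    process(org)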

View File

@@ -195,6 +195,9 @@ class Credential(PasswordFieldsModel, CommonModelNameNotUnique, ResourceMixin):
@cached_property
def dynamic_input_fields(self):
# if the credential is not yet saved we can't access the input_sources
if not self.id:
return []
return [obj.input_field_name for obj in self.input_sources.all()]
def _password_field_allows_ask(self, field):
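
The early return guards a general Django constraint: related managers such as input_sources cannot be queried until the instance has a primary key. An illustrative sketch (not a test from this PR):

cred = Credential(name='not-saved-yet')   # unsaved, so no pk assigned yet
assert cred.dynamic_input_fields == []    # short-circuits instead of raising ValueError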

View File

@@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
import datetime
from datetime import timezone
import logging
from collections import defaultdict
@@ -10,7 +11,7 @@ from django.db import models, DatabaseError
from django.db.models.functions import Cast
from django.utils.dateparse import parse_datetime
from django.utils.text import Truncator
from django.utils.timezone import utc, now
from django.utils.timezone import now
from django.utils.translation import gettext_lazy as _
from django.utils.encoding import force_str
@@ -422,7 +423,7 @@ class BasePlaybookEvent(CreatedModifiedModel):
if not isinstance(kwargs['created'], datetime.datetime):
kwargs['created'] = parse_datetime(kwargs['created'])
if not kwargs['created'].tzinfo:
kwargs['created'] = kwargs['created'].replace(tzinfo=utc)
kwargs['created'] = kwargs['created'].replace(tzinfo=timezone.utc)
except (KeyError, ValueError):
kwargs.pop('created', None)
@@ -432,7 +433,7 @@ class BasePlaybookEvent(CreatedModifiedModel):
if not isinstance(kwargs['job_created'], datetime.datetime):
kwargs['job_created'] = parse_datetime(kwargs['job_created'])
if not kwargs['job_created'].tzinfo:
kwargs['job_created'] = kwargs['job_created'].replace(tzinfo=utc)
kwargs['job_created'] = kwargs['job_created'].replace(tzinfo=timezone.utc)
except (KeyError, ValueError):
kwargs.pop('job_created', None)
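
The utc rename repeated through this file is mechanical: django.utils.timezone.utc was a plain alias of the stdlib object and is deprecated as of Django 4.1. A minimal equivalence sketch:

import datetime
from datetime import timezone

naive = datetime.datetime(2018, 1, 1)
aware = naive.replace(tzinfo=timezone.utc)  # same object the old alias pointed at
assert aware.tzinfo is timezone.utc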
@@ -470,11 +471,11 @@ class JobEvent(BasePlaybookEvent):
class Meta:
app_label = 'main'
ordering = ('pk',)
index_together = [
('job', 'job_created', 'event'),
('job', 'job_created', 'uuid'),
('job', 'job_created', 'parent_uuid'),
('job', 'job_created', 'counter'),
indexes = [
models.Index(fields=['job', 'job_created', 'event']),
models.Index(fields=['job', 'job_created', 'uuid']),
models.Index(fields=['job', 'job_created', 'parent_uuid']),
models.Index(fields=['job', 'job_created', 'counter']),
]
id = models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')
@@ -632,10 +633,10 @@ class ProjectUpdateEvent(BasePlaybookEvent):
class Meta:
app_label = 'main'
ordering = ('pk',)
index_together = [
('project_update', 'job_created', 'event'),
('project_update', 'job_created', 'uuid'),
('project_update', 'job_created', 'counter'),
indexes = [
models.Index(fields=['project_update', 'job_created', 'event']),
models.Index(fields=['project_update', 'job_created', 'uuid']),
models.Index(fields=['project_update', 'job_created', 'counter']),
]
id = models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')
@@ -734,7 +735,7 @@ class BaseCommandEvent(CreatedModifiedModel):
if not isinstance(kwargs['created'], datetime.datetime):
kwargs['created'] = parse_datetime(kwargs['created'])
if not kwargs['created'].tzinfo:
kwargs['created'] = kwargs['created'].replace(tzinfo=utc)
kwargs['created'] = kwargs['created'].replace(tzinfo=timezone.utc)
except (KeyError, ValueError):
kwargs.pop('created', None)
@@ -770,10 +771,10 @@ class AdHocCommandEvent(BaseCommandEvent):
class Meta:
app_label = 'main'
ordering = ('-pk',)
index_together = [
('ad_hoc_command', 'job_created', 'event'),
('ad_hoc_command', 'job_created', 'uuid'),
('ad_hoc_command', 'job_created', 'counter'),
indexes = [
models.Index(fields=['ad_hoc_command', 'job_created', 'event']),
models.Index(fields=['ad_hoc_command', 'job_created', 'uuid']),
models.Index(fields=['ad_hoc_command', 'job_created', 'counter']),
]
EVENT_TYPES = [
@@ -875,9 +876,9 @@ class InventoryUpdateEvent(BaseCommandEvent):
class Meta:
app_label = 'main'
ordering = ('-pk',)
index_together = [
('inventory_update', 'job_created', 'uuid'),
('inventory_update', 'job_created', 'counter'),
indexes = [
models.Index(fields=['inventory_update', 'job_created', 'uuid']),
models.Index(fields=['inventory_update', 'job_created', 'counter']),
]
id = models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')
@@ -920,9 +921,9 @@ class SystemJobEvent(BaseCommandEvent):
class Meta:
app_label = 'main'
ordering = ('-pk',)
index_together = [
('system_job', 'job_created', 'uuid'),
('system_job', 'job_created', 'counter'),
indexes = [
models.Index(fields=['system_job', 'job_created', 'uuid']),
models.Index(fields=['system_job', 'job_created', 'counter']),
]
id = models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')
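
Every event model in this file receives the same mechanical Meta change: index_together is deprecated in Django 4.2, and each tuple maps one-for-one onto a models.Index with identical fields. The general shape, with placeholder field names:

from django.db import models

# before (deprecated):
#     index_together = [('a', 'b', 'c')]
# after (same database index, explicit declaration):
indexes = [models.Index(fields=['a', 'b', 'c'])]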

View File

@@ -1479,8 +1479,6 @@ class PluginFileInjector(object):
def build_env(self, inventory_update, env, private_data_dir, private_data_files):
injector_env = self.get_plugin_env(inventory_update, private_data_dir, private_data_files)
env.update(injector_env)
# Preserves current behavior for Ansible change in default planned for 2.10
env['ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS'] = 'never'
# All CLOUD_PROVIDERS sources implement as inventory plugin from collection
env['ANSIBLE_INVENTORY_ENABLED'] = 'auto'
return env

View File

@@ -141,7 +141,7 @@ class Role(models.Model):
app_label = 'main'
verbose_name_plural = _('roles')
db_table = 'main_rbac_roles'
index_together = [("content_type", "object_id")]
indexes = [models.Index(fields=["content_type", "object_id"])]
ordering = ("content_type", "object_id")
role_field = models.TextField(null=False)
@@ -447,10 +447,10 @@ class RoleAncestorEntry(models.Model):
app_label = 'main'
verbose_name_plural = _('role_ancestors')
db_table = 'main_rbac_role_ancestors'
index_together = [
("ancestor", "content_type_id", "object_id"), # used by get_roles_on_resource
("ancestor", "content_type_id", "role_field"), # used by accessible_objects
("ancestor", "descendent"), # used by rebuild_role_ancestor_list in the NOT EXISTS clauses.
indexes = [
models.Index(fields=["ancestor", "content_type_id", "object_id"]), # used by get_roles_on_resource
models.Index(fields=["ancestor", "content_type_id", "role_field"]), # used by accessible_objects
models.Index(fields=["ancestor", "descendent"]), # used by rebuild_role_ancestor_list in the NOT EXISTS clauses.
]
descendent = models.ForeignKey(Role, null=False, on_delete=models.CASCADE, related_name='+')

View File

@@ -1137,11 +1137,9 @@ class UnifiedJob(
if total > max_supported:
raise StdoutMaxBytesExceeded(total, max_supported)
# psycopg2's copy_expert writes bytes, but callers of this
# psycopg3's copy writes bytes, but callers of this
# function assume a str-based fd will be returned; decode
# .write() calls on the fly to maintain this interface
_write = fd.write
fd.write = lambda s: _write(smart_str(s))
tbl = self._meta.db_table + 'event'
created_by_cond = ''
if self.has_unpartitioned_events:
@@ -1150,7 +1148,9 @@ class UnifiedJob(
created_by_cond = f"job_created='{self.created.isoformat()}' AND "
sql = f"copy (select stdout from {tbl} where {created_by_cond}{self.event_parent_key}={self.id} and stdout != '' order by start_line) to stdout" # nosql
cursor.copy_expert(sql, fd)
with cursor.copy(sql) as copy:
while data := copy.read():
fd.write(smart_str(bytes(data)))
if hasattr(fd, 'name'):
# If we're dealing with a physical file, use `sed` to clean
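
This hunk is the core psycopg2-to-psycopg3 behavior change of the PR: copy_expert(sql, fd) pushed rows into a file object on your behalf, while psycopg3's cursor.copy(sql) returns a Copy object that the caller drains. A hedged standalone sketch, with connection string and query as placeholders:

import psycopg  # psycopg3

with psycopg.connect("dbname=awx") as conn, conn.cursor() as cursor:
    with open("stdout.txt", "w") as fd:
        with cursor.copy("COPY (SELECT stdout FROM main_jobevent) TO STDOUT") as copy:
            while data := copy.read():          # bytes-like chunks; falsy when done
                fd.write(bytes(data).decode())  # preserve the str-based fd interface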

View File

@@ -893,15 +893,8 @@ def _reconstruct_relationships(copy_mapping):
@task(queue=get_task_queuename)
def deep_copy_model_obj(model_module, model_name, obj_pk, new_obj_pk, user_pk, uuid, permission_check_func=None):
sub_obj_list = cache.get(uuid)
if sub_obj_list is None:
logger.error('Deep copy {} from {} to {} failed unexpectedly.'.format(model_name, obj_pk, new_obj_pk))
return
def deep_copy_model_obj(model_module, model_name, obj_pk, new_obj_pk, user_pk, permission_check_func=None):
logger.debug('Deep copy {} from {} to {}.'.format(model_name, obj_pk, new_obj_pk))
from awx.api.generics import CopyAPIView
from awx.main.signals import disable_activity_stream
model = getattr(importlib.import_module(model_module), model_name, None)
if model is None:
@@ -913,6 +906,28 @@ def deep_copy_model_obj(model_module, model_name, obj_pk, new_obj_pk, user_pk, u
except ObjectDoesNotExist:
logger.warning("Object or user no longer exists.")
return
o2m_to_preserve = {}
fields_to_preserve = set(getattr(model, 'FIELDS_TO_PRESERVE_AT_COPY', []))
for field in model._meta.get_fields():
if field.name in fields_to_preserve:
if field.one_to_many:
try:
field_val = getattr(obj, field.name)
except AttributeError:
continue
o2m_to_preserve[field.name] = field_val
sub_obj_list = []
for o2m in o2m_to_preserve:
for sub_obj in o2m_to_preserve[o2m].all():
sub_model = type(sub_obj)
sub_obj_list.append((sub_model.__module__, sub_model.__name__, sub_obj.pk))
from awx.api.generics import CopyAPIView
from awx.main.signals import disable_activity_stream
with transaction.atomic(), ignore_inventory_computed_fields(), disable_activity_stream():
copy_mapping = {}
for sub_obj_setup in sub_obj_list:
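
The net effect of this refactor is that the task no longer depends on a cache entry keyed by uuid, which could expire before the task ran and abort the copy; it now derives the FIELDS_TO_PRESERVE_AT_COPY sub-objects itself. A sketch of that collection step as a standalone function (mirroring the loop above, not the module verbatim):

def collect_sub_objects(model, obj):
    preserved = set(getattr(model, 'FIELDS_TO_PRESERVE_AT_COPY', []))
    sub_obj_list = []
    for field in model._meta.get_fields():
        if field.name in preserved and field.one_to_many:
            for sub_obj in getattr(obj, field.name).all():
                sub_model = type(sub_obj)
                sub_obj_list.append((sub_model.__module__, sub_model.__name__, sub_obj.pk))
    return sub_obj_list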

View File

@@ -1,9 +1,8 @@
{
"ANSIBLE_JINJA2_NATIVE": "True",
"ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS": "never",
"AZURE_CLIENT_ID": "fooo",
"AZURE_CLOUD_ENVIRONMENT": "fooo",
"AZURE_SECRET": "fooo",
"AZURE_SUBSCRIPTION_ID": "fooo",
"AZURE_TENANT": "fooo"
}
}

View File

@@ -1,5 +1,4 @@
{
"ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS": "never",
"TOWER_HOST": "https://foo.invalid",
"TOWER_PASSWORD": "fooo",
"TOWER_USERNAME": "fooo",
@@ -10,4 +9,4 @@
"CONTROLLER_USERNAME": "fooo",
"CONTROLLER_OAUTH_TOKEN": "",
"CONTROLLER_VERIFY_SSL": "False"
}
}

View File

@@ -1,8 +1,7 @@
{
"ANSIBLE_JINJA2_NATIVE": "True",
"ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS": "never",
"AWS_ACCESS_KEY_ID": "fooo",
"AWS_SECRET_ACCESS_KEY": "fooo",
"AWS_SECURITY_TOKEN": "fooo",
"AWS_SESSION_TOKEN": "fooo"
}
}

View File

@@ -1,6 +1,5 @@
{
"ANSIBLE_JINJA2_NATIVE": "True",
"ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS": "never",
"GCE_CREDENTIALS_FILE_PATH": "{{ file_reference }}",
"GOOGLE_APPLICATION_CREDENTIALS": "{{ file_reference }}",
"GCP_AUTH_KIND": "serviceaccount",

View File

@@ -1,5 +1,4 @@
{
"ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS": "never",
"INSIGHTS_USER": "fooo",
"INSIGHTS_PASSWORD": "fooo"
}
}

View File

@@ -1,4 +1,3 @@
{
"ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS": "never",
"OS_CLIENT_CONFIG_FILE": "{{ file_reference }}"
}
}

View File

@@ -1,7 +1,6 @@
{
"ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS": "never",
"OVIRT_INI_PATH": "{{ file_reference }}",
"OVIRT_PASSWORD": "fooo",
"OVIRT_URL": "https://foo.invalid",
"OVIRT_USERNAME": "fooo"
}
}

View File

@@ -1,6 +1,5 @@
{
"ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS": "never",
"FOREMAN_PASSWORD": "fooo",
"FOREMAN_SERVER": "https://foo.invalid",
"FOREMAN_USER": "fooo"
}
}

View File

@@ -1,7 +1,6 @@
{
"ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS": "never",
"VMWARE_HOST": "https://foo.invalid",
"VMWARE_PASSWORD": "fooo",
"VMWARE_USER": "fooo",
"VMWARE_VALIDATE_CERTS": "False"
}
}

View File

@@ -2,8 +2,8 @@ import pytest
import tempfile
import os
import re
import shutil
import csv
from io import StringIO
from django.utils.timezone import now
from datetime import timedelta
@@ -20,15 +20,16 @@ from awx.main.models import (
)
@pytest.fixture
def sqlite_copy_expert(request):
# copy_expert is postgres-specific, and SQLite doesn't support it; mock its
# behavior to test that it writes a file that contains stdout from events
path = tempfile.mkdtemp(prefix="copied_tables")
class MockCopy:
headers = None
results = None
sent_data = False
def write_stdout(self, sql, fd):
def __init__(self, sql, parent_connection):
# It would be better to properly dissect the SQL query and verify it
# that way, but we take the naive approach here.
self.results = None
self.headers = None
sql = sql.strip()
assert sql.startswith("COPY (")
assert sql.endswith(") TO STDOUT WITH CSV HEADER")
@@ -51,29 +52,49 @@ def sqlite_copy_expert(request):
elif not line.endswith(","):
sql_new[-1] = sql_new[-1].rstrip(",")
sql = "\n".join(sql_new)
parent_connection.execute(sql)
self.results = parent_connection.fetchall()
self.headers = [i[0] for i in parent_connection.description]
self.execute(sql)
results = self.fetchall()
headers = [i[0] for i in self.description]
def read(self):
if not self.sent_data:
mem_file = StringIO()
csv_handle = csv.writer(
mem_file,
delimiter=",",
quoting=csv.QUOTE_ALL,
escapechar="\\",
lineterminator="\n",
)
if self.headers:
csv_handle.writerow(self.headers)
if self.results:
csv_handle.writerows(self.results)
self.sent_data = True
return memoryview((mem_file.getvalue()).encode())
return None
csv_handle = csv.writer(
fd,
delimiter=",",
quoting=csv.QUOTE_ALL,
escapechar="\\",
lineterminator="\n",
)
csv_handle.writerow(headers)
csv_handle.writerows(results)
def __enter__(self):
return self
setattr(SQLiteCursorWrapper, "copy_expert", write_stdout)
request.addfinalizer(lambda: shutil.rmtree(path))
request.addfinalizer(lambda: delattr(SQLiteCursorWrapper, "copy_expert"))
return path
def __exit__(self, exc_type, exc_val, exc_tb):
pass
@pytest.fixture
def sqlite_copy(request, mocker):
# copy is postgres-specific, and SQLite doesn't support it; mock its
# behavior to test that it writes a file that contains stdout from events
def write_stdout(self, sql):
mock_copy = MockCopy(sql, self)
return mock_copy
mocker.patch.object(SQLiteCursorWrapper, 'copy', write_stdout, create=True)
@pytest.mark.django_db
def test_copy_tables_unified_job_query(sqlite_copy_expert, project, inventory, job_template):
def test_copy_tables_unified_job_query(sqlite_copy, project, inventory, job_template):
"""
Ensure that various unified job types are in the output of the query.
"""
@@ -127,7 +148,7 @@ def workflow_job(states=["new", "new", "new", "new", "new"]):
@pytest.mark.django_db
def test_copy_tables_workflow_job_node_query(sqlite_copy_expert, workflow_job):
def test_copy_tables_workflow_job_node_query(sqlite_copy, workflow_job):
time_start = now() - timedelta(hours=9)
with tempfile.TemporaryDirectory() as tmpdir:
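
The fixture swap mirrors the production change: tests no longer monkeypatch a copy_expert(sql, fd) method, they patch a copy(sql) method whose return value satisfies psycopg3's context-manager protocol (note the create=True, since SQLite's cursor wrapper has no copy attribute to begin with). The minimal surface a stand-in must offer, sketched:

class MinimalCopyStub:
    """Just enough for `with cursor.copy(sql) as c:` plus a read() loop."""

    def __init__(self, payload: bytes):
        self._payload = payload
        self._sent = False

    def read(self):
        if not self._sent:
            self._sent = True
            return memoryview(self._payload)  # one chunk...
        return None                           # ...then a falsy sentinel ends the loop

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        pass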

View File

@@ -214,7 +214,7 @@ class TestControllerNode:
return AdHocCommand.objects.create(inventory=inventory)
@pytest.mark.django_db
def test_field_controller_node_exists(self, sqlite_copy_expert, admin_user, job, project_update, inventory_update, adhoc, get, system_job_factory):
def test_field_controller_node_exists(self, sqlite_copy, admin_user, job, project_update, inventory_update, adhoc, get, system_job_factory):
system_job = system_job_factory()
r = get(reverse('api:unified_job_list') + '?id={}'.format(job.id), admin_user, expect=200)

View File

@@ -3,7 +3,7 @@ import pytest
# AWX
from awx.api.serializers import JobTemplateSerializer
from awx.api.versioning import reverse
from awx.main.models import Job, JobTemplate, CredentialType, WorkflowJobTemplate, Organization, Project
from awx.main.models import Job, JobTemplate, CredentialType, WorkflowJobTemplate, Organization, Project, Inventory
from awx.main.migrations import _save_password_keys as save_password_keys
# Django
@@ -353,3 +353,19 @@ def test_job_template_branch_prompt_error(project, inventory, post, admin_user):
expect=400,
)
assert 'Project does not allow overriding branch' in str(r.data['ask_scm_branch_on_launch'])
@pytest.mark.django_db
def test_job_template_missing_inventory(project, inventory, admin_user, post):
jt = JobTemplate.objects.create(
name='test-jt', inventory=inventory, ask_inventory_on_launch=True, project=project, playbook='helloworld.yml', host_config_key='abcd'
)
Inventory.objects.get(pk=inventory.pk).delete()
r = post(
url=reverse('api:job_template_callback', kwargs={'pk': jt.pk}),
data={'host_config_key': 'abcd'},
user=admin_user,
expect=400,
)
assert r.status_code == 400
assert "Cannot start automatically, an inventory is required." in str(r.data)

View File

@@ -329,3 +329,21 @@ def test_galaxy_credential_association(alice, admin, organization, post, get):
'Public Galaxy 4',
'Public Galaxy 5',
]
@pytest.mark.django_db
def test_org_admin_credential_count(org_admin, admin, organization, post, get):
galaxy = CredentialType.defaults['galaxy_api_token']()
galaxy.save()
for i in range(3):
cred = Credential.objects.create(credential_type=galaxy, name=f'test_{i}', inputs={'url': 'https://galaxy.ansible.com/'})
url = reverse('api:organization_galaxy_credentials_list', kwargs={'pk': organization.pk})
post(url, {'associate': True, 'id': cred.pk}, user=admin, expect=204)
# org admin should see all associated galaxy credentials
resp = get(url, user=org_admin)
assert resp.data['count'] == 3
# removing one to validate new count
post(url, {'disassociate': True, 'id': Credential.objects.get(name='test_1').pk}, user=admin, expect=204)
resp_new = get(url, user=org_admin)
assert resp_new.data['count'] == 2

View File

@@ -57,7 +57,7 @@ def _mk_inventory_update(created=None):
[_mk_inventory_update, InventoryUpdateEvent, 'inventory_update', 'api:inventory_update_stdout'],
],
)
def test_text_stdout(sqlite_copy_expert, Parent, Child, relation, view, get, admin):
def test_text_stdout(sqlite_copy, Parent, Child, relation, view, get, admin):
job = Parent()
job.save()
for i in range(3):
@@ -79,7 +79,7 @@ def test_text_stdout(sqlite_copy_expert, Parent, Child, relation, view, get, adm
],
)
@pytest.mark.parametrize('download', [True, False])
def test_ansi_stdout_filtering(sqlite_copy_expert, Parent, Child, relation, view, download, get, admin):
def test_ansi_stdout_filtering(sqlite_copy, Parent, Child, relation, view, download, get, admin):
job = Parent()
job.save()
for i in range(3):
@@ -111,7 +111,7 @@ def test_ansi_stdout_filtering(sqlite_copy_expert, Parent, Child, relation, view
[_mk_inventory_update, InventoryUpdateEvent, 'inventory_update', 'api:inventory_update_stdout'],
],
)
def test_colorized_html_stdout(sqlite_copy_expert, Parent, Child, relation, view, get, admin):
def test_colorized_html_stdout(sqlite_copy, Parent, Child, relation, view, get, admin):
job = Parent()
job.save()
for i in range(3):
@@ -134,7 +134,7 @@ def test_colorized_html_stdout(sqlite_copy_expert, Parent, Child, relation, view
[_mk_inventory_update, InventoryUpdateEvent, 'inventory_update', 'api:inventory_update_stdout'],
],
)
def test_stdout_line_range(sqlite_copy_expert, Parent, Child, relation, view, get, admin):
def test_stdout_line_range(sqlite_copy, Parent, Child, relation, view, get, admin):
job = Parent()
job.save()
for i in range(20):
@@ -146,7 +146,7 @@ def test_stdout_line_range(sqlite_copy_expert, Parent, Child, relation, view, ge
@pytest.mark.django_db
def test_text_stdout_from_system_job_events(sqlite_copy_expert, get, admin):
def test_text_stdout_from_system_job_events(sqlite_copy, get, admin):
created = tz_now()
job = SystemJob(created=created)
job.save()
@@ -158,7 +158,7 @@ def test_text_stdout_from_system_job_events(sqlite_copy_expert, get, admin):
@pytest.mark.django_db
def test_text_stdout_with_max_stdout(sqlite_copy_expert, get, admin):
def test_text_stdout_with_max_stdout(sqlite_copy, get, admin):
created = tz_now()
job = SystemJob(created=created)
job.save()
@@ -185,7 +185,7 @@ def test_text_stdout_with_max_stdout(sqlite_copy_expert, get, admin):
)
@pytest.mark.parametrize('fmt', ['txt', 'ansi'])
@mock.patch('awx.main.redact.UriCleaner.SENSITIVE_URI_PATTERN', mock.Mock(**{'search.return_value': None})) # really slow for large strings
def test_max_bytes_display(sqlite_copy_expert, Parent, Child, relation, view, fmt, get, admin):
def test_max_bytes_display(sqlite_copy, Parent, Child, relation, view, fmt, get, admin):
created = tz_now()
job = Parent(created=created)
job.save()
@@ -255,7 +255,7 @@ def test_legacy_result_stdout_with_max_bytes(Cls, view, fmt, get, admin):
],
)
@pytest.mark.parametrize('fmt', ['txt', 'ansi', 'txt_download', 'ansi_download'])
def test_text_with_unicode_stdout(sqlite_copy_expert, Parent, Child, relation, view, get, admin, fmt):
def test_text_with_unicode_stdout(sqlite_copy, Parent, Child, relation, view, get, admin, fmt):
job = Parent()
job.save()
for i in range(3):
@@ -267,7 +267,7 @@ def test_text_with_unicode_stdout(sqlite_copy_expert, Parent, Child, relation, v
@pytest.mark.django_db
def test_unicode_with_base64_ansi(sqlite_copy_expert, get, admin):
def test_unicode_with_base64_ansi(sqlite_copy, get, admin):
created = tz_now()
job = Job(created=created)
job.save()

View File

@@ -1,8 +1,6 @@
# Python
import pytest
from unittest import mock
import tempfile
import shutil
import urllib.parse
from unittest.mock import PropertyMock
@@ -789,25 +787,43 @@ def oauth_application(admin):
return Application.objects.create(name='test app', user=admin, client_type='confidential', authorization_grant_type='password')
@pytest.fixture
def sqlite_copy_expert(request):
# copy_expert is postgres-specific, and SQLite doesn't support it; mock its
# behavior to test that it writes a file that contains stdout from events
path = tempfile.mkdtemp(prefix='job-event-stdout')
class MockCopy:
events = []
index = -1
def write_stdout(self, sql, fd):
# simulate postgres copy_expert support with ORM code
def __init__(self, sql):
self.events = []
parts = sql.split(' ')
tablename = parts[parts.index('from') + 1]
for cls in (JobEvent, AdHocCommandEvent, ProjectUpdateEvent, InventoryUpdateEvent, SystemJobEvent):
if cls._meta.db_table == tablename:
for event in cls.objects.order_by('start_line').all():
fd.write(event.stdout)
self.events.append(event.stdout)
setattr(SQLiteCursorWrapper, 'copy_expert', write_stdout)
request.addfinalizer(lambda: shutil.rmtree(path))
request.addfinalizer(lambda: delattr(SQLiteCursorWrapper, 'copy_expert'))
return path
def read(self):
self.index = self.index + 1
if self.index < len(self.events):
return memoryview(self.events[self.index].encode())
return None
def __enter__(self):
return self
def __exit__(self, exc_type, exc_val, exc_tb):
pass
@pytest.fixture
def sqlite_copy(request, mocker):
# copy is postgres-specific, and SQLite doesn't support it; mock its
# behavior to test that it writes a file that contains stdout from events
def write_stdout(self, sql):
mock_copy = MockCopy(sql)
return mock_copy
mocker.patch.object(SQLiteCursorWrapper, 'copy', write_stdout, create=True)
@pytest.fixture

View File

@@ -271,6 +271,7 @@ def test_inventory_update_excessively_long_name(inventory, inventory_source):
class TestHostManager:
def test_host_filter_not_smart(self, setup_ec2_gce, organization):
smart_inventory = Inventory(name='smart', organization=organization, host_filter='inventory_sources__source=ec2')
smart_inventory.save()
assert len(smart_inventory.hosts.all()) == 0
def test_host_distinctness(self, setup_inventory_groups, organization):

View File

@@ -98,7 +98,7 @@ class TestJobNotificationMixin(object):
@pytest.mark.django_db
@pytest.mark.parametrize('JobClass', [AdHocCommand, InventoryUpdate, Job, ProjectUpdate, SystemJob, WorkflowJob])
def test_context(self, JobClass, sqlite_copy_expert, project, inventory_source):
def test_context(self, JobClass, sqlite_copy, project, inventory_source):
"""The Jinja context defines all of the fields that can be used by a template. Ensure that the context generated
for each job type has the expected structure."""
kwargs = {}

View File

@@ -0,0 +1,28 @@
# Python
from unittest import mock
import uuid
# patch python-ldap
with mock.patch('__main__.__builtins__.dir', return_value=[]):
import ldap # NOQA
# Load development settings for base variables.
from awx.settings.development import * # NOQA
# Some things make decisions based on settings.SETTINGS_MODULE, so this is done for that
SETTINGS_MODULE = 'awx.settings.development'
# Use SQLite for unit tests instead of PostgreSQL. If the lines below are
# commented out, Django will create the test_awx-dev database in PostgreSQL to
# run unit tests.
CACHES = {'default': {'BACKEND': 'django.core.cache.backends.locmem.LocMemCache', 'LOCATION': 'unique-{}'.format(str(uuid.uuid4()))}}
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'awx.sqlite3'), # noqa
'TEST': {
# Test database cannot be :memory: for inventory tests.
'NAME': os.path.join(BASE_DIR, 'awx_test.sqlite3') # noqa
},
}
}

View File

@@ -1,5 +1,5 @@
from datetime import datetime
from django.utils.timezone import utc
from datetime import timezone
import pytest
from awx.main.models import JobEvent, ProjectUpdateEvent, AdHocCommandEvent, InventoryUpdateEvent, SystemJobEvent
@@ -18,7 +18,7 @@ from awx.main.models import JobEvent, ProjectUpdateEvent, AdHocCommandEvent, Inv
@pytest.mark.parametrize('created', [datetime(2018, 1, 1).isoformat(), datetime(2018, 1, 1)])
def test_event_parse_created(job_identifier, cls, created):
event = cls.create_from_data(**{job_identifier: 123, 'created': created})
assert event.created == datetime(2018, 1, 1).replace(tzinfo=utc)
assert event.created == datetime(2018, 1, 1).replace(tzinfo=timezone.utc)
@pytest.mark.parametrize(

View File

@@ -1,8 +1,32 @@
from split_settings.tools import include
LOCAL_SETTINGS = (
'ALLOWED_HOSTS',
'BROADCAST_WEBSOCKET_PORT',
'BROADCAST_WEBSOCKET_VERIFY_CERT',
'BROADCAST_WEBSOCKET_PROTOCOL',
'BROADCAST_WEBSOCKET_SECRET',
'DATABASES',
'CACHES',
'DEBUG',
'NAMED_URL_GRAPH',
)
def test_postprocess_auth_basic_enabled():
locals().update({'__file__': __file__})
include('../../../settings/defaults.py', scope=locals())
assert 'awx.api.authentication.LoggedBasicAuthentication' in locals()['REST_FRAMEWORK']['DEFAULT_AUTHENTICATION_CLASSES']
def test_default_settings():
from django.conf import settings
for k in dir(settings):
if k not in settings.DEFAULTS_SNAPSHOT or k in LOCAL_SETTINGS:
continue
default_val = getattr(settings.default_settings, k, None)
snapshot_val = settings.DEFAULTS_SNAPSHOT[k]
assert default_val == snapshot_val, f'Setting for {k} does not match snapshot:\nsnapshot: {snapshot_val}\ndefault: {default_val}'

View File

@@ -1,24 +1,16 @@
# Copyright (c) 2015 Ansible, Inc.
# All Rights Reserved.
# Python
import base64
import os
import re # noqa
import sys
import tempfile
import socket
from datetime import timedelta
if "pytest" in sys.modules:
IS_TESTING_MODE = True
from unittest import mock
with mock.patch('__main__.__builtins__.dir', return_value=[]):
import ldap
else:
IS_TESTING_MODE = False
import ldap
# python-ldap
import ldap
DEBUG = True
@@ -403,6 +395,7 @@ AUTHENTICATION_BACKENDS = (
OAUTH2_PROVIDER_APPLICATION_MODEL = 'main.OAuth2Application'
OAUTH2_PROVIDER_ACCESS_TOKEN_MODEL = 'main.OAuth2AccessToken'
OAUTH2_PROVIDER_REFRESH_TOKEN_MODEL = 'oauth2_provider.RefreshToken'
OAUTH2_PROVIDER_ID_TOKEN_MODEL = "oauth2_provider.IDToken"
OAUTH2_PROVIDER = {'ACCESS_TOKEN_EXPIRE_SECONDS': 31536000000, 'AUTHORIZATION_CODE_EXPIRE_SECONDS': 600, 'REFRESH_TOKEN_EXPIRE_SECONDS': 2628000}
ALLOW_OAUTH2_FOR_EXTERNAL_USERS = False

View File

@@ -9,7 +9,6 @@ import socket
import copy
import sys
import traceback
import uuid
# Centos-7 doesn't include the svg mime type
# /usr/lib64/python/mimetypes.py
@@ -62,38 +61,9 @@ DEBUG_TOOLBAR_CONFIG = {'ENABLE_STACKTRACES': True}
SYSTEM_UUID = '00000000-0000-0000-0000-000000000000'
INSTALL_UUID = '00000000-0000-0000-0000-000000000000'
# Store a snapshot of default settings at this point before loading any
# customizable config files.
DEFAULTS_SNAPSHOT = {}
this_module = sys.modules[__name__]
for setting in dir(this_module):
if setting == setting.upper():
DEFAULTS_SNAPSHOT[setting] = copy.deepcopy(getattr(this_module, setting))
# If there is an `/etc/tower/settings.py`, include it.
# If there is a `/etc/tower/conf.d/*.py`, include them.
include(optional('/etc/tower/settings.py'), scope=locals())
include(optional('/etc/tower/conf.d/*.py'), scope=locals())
BASE_VENV_PATH = "/var/lib/awx/venv/"
AWX_VENV_PATH = os.path.join(BASE_VENV_PATH, "awx")
# Use SQLite for unit tests instead of PostgreSQL. If the lines below are
# commented out, Django will create the test_awx-dev database in PostgreSQL to
# run unit tests.
if "pytest" in sys.modules:
CACHES = {'default': {'BACKEND': 'django.core.cache.backends.locmem.LocMemCache', 'LOCATION': 'unique-{}'.format(str(uuid.uuid4()))}}
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'awx.sqlite3'), # noqa
'TEST': {
# Test database cannot be :memory: for inventory tests.
'NAME': os.path.join(BASE_DIR, 'awx_test.sqlite3') # noqa
},
}
}
CLUSTER_HOST_ID = socket.gethostname()
AWX_CALLBACK_PROFILE = True
@@ -105,11 +75,28 @@ AWX_CALLBACK_PROFILE = True
AWX_DISABLE_TASK_MANAGERS = False
# ======================!!!!!!! FOR DEVELOPMENT ONLY !!!!!!!=================================
from .application_name import set_application_name
# Store a snapshot of default settings at this point before loading any
# customizable config files.
this_module = sys.modules[__name__]
local_vars = dir(this_module)
DEFAULTS_SNAPSHOT = {} # define after we save local_vars so we do not snapshot the snapshot
for setting in local_vars:
if setting.isupper():
DEFAULTS_SNAPSHOT[setting] = copy.deepcopy(getattr(this_module, setting))
set_application_name(DATABASES, CLUSTER_HOST_ID)
del local_vars # avoid temporary variables from showing up in dir(settings)
del this_module
#
###############################################################################################
#
# Any settings defined after this point will be marked as a read_only database setting
#
################################################################################################
del set_application_name
# If there is an `/etc/tower/settings.py`, include it.
# If there is a `/etc/tower/conf.d/*.py`, include them.
include(optional('/etc/tower/settings.py'), scope=locals())
include(optional('/etc/tower/conf.d/*.py'), scope=locals())
# If any local_*.py files are present in awx/settings/, use them to override
# default settings for development. If not present, we can still run using
@@ -123,3 +110,11 @@ try:
except ImportError:
traceback.print_exc()
sys.exit(1)
# The below runs AFTER all of the custom settings are imported
# because conf.d files will define DATABASES and this should modify that
from .application_name import set_application_name
set_application_name(DATABASES, CLUSTER_HOST_ID) # NOQA
del set_application_name

View File

@@ -47,17 +47,21 @@ AWX_ISOLATION_SHOW_PATHS = [
# Store a snapshot of default settings at this point before loading any
# customizable config files.
this_module = sys.modules[__name__]
local_vars = dir(this_module)
DEFAULTS_SNAPSHOT = {} # define after we save local_vars so we do not snapshot the snapshot
for setting in local_vars:
if setting.isupper():
DEFAULTS_SNAPSHOT[setting] = copy.deepcopy(getattr(this_module, setting))
del local_vars # avoid temporary variables from showing up in dir(settings)
del this_module
#
###############################################################################################
#
# Any settings defined after this point will be marked as a read_only database setting
#
################################################################################################
DEFAULTS_SNAPSHOT = {}
this_module = sys.modules[__name__]
for setting in dir(this_module):
if setting == setting.upper():
DEFAULTS_SNAPSHOT[setting] = copy.deepcopy(getattr(this_module, setting))
# Load settings from any .py files in the global conf.d directory specified in
# the environment, defaulting to /etc/tower/conf.d/.
@@ -98,8 +102,8 @@ except IOError:
else:
raise
# The below runs AFTER all of the custom settings are imported.
# The below runs AFTER all of the custom settings are imported
# because conf.d files will define DATABASES and this should modify that
from .application_name import set_application_name
set_application_name(DATABASES, CLUSTER_HOST_ID) # NOQA

View File

@@ -115,16 +115,16 @@ function AdHocCredentialStep({ credentialTypeId }) {
searchColumns={[
{
name: t`Name`,
key: 'name',
key: 'name__icontains',
isDefault: true,
},
{
name: t`Created By (Username)`,
key: 'created_by__username',
key: 'created_by__username__icontains',
},
{
name: t`Modified By (Username)`,
key: 'modified_by__username',
key: 'modified_by__username__icontains',
},
]}
sortColumns={[
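
These keys feed directly into AWX's REST filters, so the lookup now performs case-insensitive substring matching instead of requiring an exact name. A hedged sketch of the resulting request, with host and credentials as placeholders:

import requests  # illustration only

resp = requests.get(
    "https://awx.example.invalid/api/v2/credentials/",
    params={"name__icontains": "aws"},  # previously name=<exact value>
    auth=("admin", "password"),
)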

View File

@@ -195,9 +195,9 @@ function getRouteConfig(userProfile = {}) {
deleteRoute('host_metrics');
deleteRouteGroup('settings');
deleteRoute('management_jobs');
if (userProfile?.isOrgAdmin) return routeConfig;
deleteRoute('topology_view');
deleteRoute('instances');
if (userProfile?.isOrgAdmin) return routeConfig;
if (!userProfile?.isNotificationAdmin) deleteRoute('notification_templates');
return routeConfig;

View File

@@ -101,10 +101,8 @@ describe('getRouteConfig', () => {
'/credential_types',
'/notification_templates',
'/instance_groups',
'/instances',
'/applications',
'/execution_environments',
'/topology_view',
]);
});
@@ -237,10 +235,8 @@ describe('getRouteConfig', () => {
'/credential_types',
'/notification_templates',
'/instance_groups',
'/instances',
'/applications',
'/execution_environments',
'/topology_view',
]);
});
@@ -268,10 +264,8 @@ describe('getRouteConfig', () => {
'/credential_types',
'/notification_templates',
'/instance_groups',
'/instances',
'/applications',
'/execution_environments',
'/topology_view',
]);
});
});

View File

@@ -131,12 +131,6 @@ function HostMetrics() {
>
{t`Automation`}
</HeaderCell>
<HeaderCell
sortKey="used_in_inventories"
tooltip={t`How many inventories is the host in, recomputed on a weekly schedule`}
>
{t`Inventories`}
</HeaderCell>
<HeaderCell
sortKey="deleted_counter"
tooltip={t`How many times was the host deleted`}

View File

@@ -21,7 +21,6 @@ function HostMetricsListItem({ item, isSelected, onSelect, rowIndex }) {
{formatDateString(item.last_automation)}
</Td>
<Td dataLabel={t`Automation`}>{item.automated_counter}</Td>
<Td dataLabel={t`Inventories`}>{item.used_in_inventories || 0}</Td>
<Td dataLabel={t`Deleted`}>{item.deleted_counter}</Td>
</Tr>
);

View File

@@ -4,6 +4,7 @@ import { t } from '@lingui/macro';
import { Switch, Route, Redirect, Link, useRouteMatch } from 'react-router-dom';
import { CaretLeftIcon } from '@patternfly/react-icons';
import { Card, PageSection } from '@patternfly/react-core';
import { useConfig } from 'contexts/Config';
import ContentError from 'components/ContentError';
import RoutedTabs from 'components/RoutedTabs';
import useRequest from 'hooks/useRequest';
@@ -13,6 +14,9 @@ import InstanceDetail from './InstanceDetail';
import InstancePeerList from './InstancePeers';
function Instance({ setBreadcrumb }) {
const { me } = useConfig();
const canReadSettings = me.is_superuser || me.is_system_auditor;
const match = useRouteMatch();
const tabsArray = [
{
@@ -30,19 +34,21 @@ function Instance({ setBreadcrumb }) {
];
const {
result: { isK8s },
result: isK8s,
error,
isLoading,
request,
} = useRequest(
useCallback(async () => {
if (!canReadSettings) {
return false;
}
const { data } = await SettingsAPI.readCategory('system');
return {
isK8s: data.IS_K8S,
};
}, []),
return data?.IS_K8S ?? false;
}, [canReadSettings]),
{ isK8s: false, isLoading: true }
);
useEffect(() => {
request();
}, [request]);

View File

@@ -36,6 +36,7 @@ const QS_CONFIG = getQSConfig('instance', {
function InstanceList() {
const location = useLocation();
const { me } = useConfig();
const canReadSettings = me.is_superuser || me.is_system_auditor;
const [showHealthCheckAlert, setShowHealthCheckAlert] = useState(false);
const [pendingHealthCheck, setPendingHealthCheck] = useState(false);
const [canRunHealthCheck, setCanRunHealthCheck] = useState(true);
@@ -48,18 +49,24 @@ function InstanceList() {
} = useRequest(
useCallback(async () => {
const params = parseQueryString(QS_CONFIG, location.search);
const [response, responseActions, sysSettings] = await Promise.all([
const [response, responseActions] = await Promise.all([
InstancesAPI.read(params),
InstancesAPI.readOptions(),
SettingsAPI.readCategory('system'),
]);
let sysSettings = {};
if (canReadSettings) {
sysSettings = await SettingsAPI.readCategory('system');
}
const isPending = response.data.results.some(
(i) => i.health_check_pending === true
);
setPendingHealthCheck(isPending);
return {
instances: response.data.results,
isK8s: sysSettings.data.IS_K8S,
isK8s: sysSettings?.data?.IS_K8S ?? false,
count: response.data.count,
actions: responseActions.data.actions,
relatedSearchableKeys: (
@@ -67,7 +74,7 @@ function InstanceList() {
).map((val) => val.slice(0, -8)),
searchableKeys: getSearchableKeys(responseActions.data.actions?.GET),
};
}, [location.search]),
}, [location.search, canReadSettings]),
{
instances: [],
count: 0,

View File

@@ -17,11 +17,7 @@ import { CardBody, CardActionsRow } from 'components/Card';
import { Detail, DetailList, UserDateDetail } from 'components/DetailList';
import { VariablesDetail } from 'components/CodeEditor';
import { formatDateString, secondsToHHMMSS } from 'util/dates';
import {
WorkflowApprovalsAPI,
WorkflowJobTemplatesAPI,
WorkflowJobsAPI,
} from 'api';
import { WorkflowApprovalsAPI, WorkflowJobsAPI } from 'api';
import useRequest, { useDismissableError } from 'hooks/useRequest';
import { WorkflowApproval } from 'types';
import StatusLabel from 'components/StatusLabel';
@@ -67,8 +63,10 @@ function WorkflowApprovalDetail({ workflowApproval, fetchWorkflowApproval }) {
const { error: deleteError, dismissError: dismissDeleteError } =
useDismissableError(deleteApprovalError);
const workflowJobTemplateId =
workflowApproval.summary_fields.workflow_job_template.id;
const sourceWorkflowJob =
workflowApproval?.summary_fields?.source_workflow_job;
const sourceWorkflowJobTemplate =
workflowApproval?.summary_fields?.workflow_job_template;
const {
error: fetchWorkflowJobError,
@@ -77,23 +75,10 @@ function WorkflowApprovalDetail({ workflowApproval, fetchWorkflowApproval }) {
result: workflowJob,
} = useRequest(
useCallback(async () => {
if (!workflowJobTemplateId) {
return {};
}
const { data: workflowJobTemplate } =
await WorkflowJobTemplatesAPI.readDetail(workflowJobTemplateId);
let jobId = null;
if (workflowJobTemplate.summary_fields?.current_job) {
jobId = workflowJobTemplate.summary_fields.current_job.id;
} else if (workflowJobTemplate.summary_fields?.last_job) {
jobId = workflowJobTemplate.summary_fields.last_job.id;
}
const { data } = await WorkflowJobsAPI.readDetail(jobId);
if (!sourceWorkflowJob?.id) return {};
const { data } = await WorkflowJobsAPI.readDetail(sourceWorkflowJob?.id);
return data;
}, [workflowJobTemplateId]),
}, [sourceWorkflowJob?.id]),
{
workflowJob: null,
isLoading: true,
@@ -116,11 +101,6 @@ function WorkflowApprovalDetail({ workflowApproval, fetchWorkflowApproval }) {
},
[addToast, fetchWorkflowApproval]
);
const sourceWorkflowJob =
workflowApproval?.summary_fields?.source_workflow_job;
const sourceWorkflowJobTemplate =
workflowApproval?.summary_fields?.workflow_job_template;
const isLoading = isDeleteLoading || isLoadingWorkflowJob;

View File

@@ -1,10 +1,6 @@
import React from 'react';
import { act } from 'react-dom/test-utils';
import {
WorkflowApprovalsAPI,
WorkflowJobTemplatesAPI,
WorkflowJobsAPI,
} from 'api';
import { WorkflowApprovalsAPI, WorkflowJobsAPI } from 'api';
import { formatDateString } from 'util/dates';
import {
mountWithContexts,
@@ -23,146 +19,6 @@ jest.mock('react-router-dom', () => ({
}),
}));
const workflowJobTemplate = {
id: 8,
type: 'workflow_job_template',
url: '/api/v2/workflow_job_templates/8/',
related: {
named_url: '/api/v2/workflow_job_templates/00++/',
created_by: '/api/v2/users/1/',
modified_by: '/api/v2/users/1/',
last_job: '/api/v2/workflow_jobs/111/',
workflow_jobs: '/api/v2/workflow_job_templates/8/workflow_jobs/',
schedules: '/api/v2/workflow_job_templates/8/schedules/',
launch: '/api/v2/workflow_job_templates/8/launch/',
webhook_key: '/api/v2/workflow_job_templates/8/webhook_key/',
webhook_receiver: '/api/v2/workflow_job_templates/8/github/',
workflow_nodes: '/api/v2/workflow_job_templates/8/workflow_nodes/',
labels: '/api/v2/workflow_job_templates/8/labels/',
activity_stream: '/api/v2/workflow_job_templates/8/activity_stream/',
notification_templates_started:
'/api/v2/workflow_job_templates/8/notification_templates_started/',
notification_templates_success:
'/api/v2/workflow_job_templates/8/notification_templates_success/',
notification_templates_error:
'/api/v2/workflow_job_templates/8/notification_templates_error/',
notification_templates_approvals:
'/api/v2/workflow_job_templates/8/notification_templates_approvals/',
access_list: '/api/v2/workflow_job_templates/8/access_list/',
object_roles: '/api/v2/workflow_job_templates/8/object_roles/',
survey_spec: '/api/v2/workflow_job_templates/8/survey_spec/',
copy: '/api/v2/workflow_job_templates/8/copy/',
},
summary_fields: {
last_job: {
id: 111,
name: '00',
description: '',
finished: '2022-05-10T17:29:52.978531Z',
status: 'successful',
failed: false,
},
last_update: {
id: 111,
name: '00',
description: '',
status: 'successful',
failed: false,
},
created_by: {
id: 1,
username: 'admin',
first_name: '',
last_name: '',
},
modified_by: {
id: 1,
username: 'admin',
first_name: '',
last_name: '',
},
object_roles: {
admin_role: {
description: 'Can manage all aspects of the workflow job template',
name: 'Admin',
id: 34,
},
execute_role: {
description: 'May run the workflow job template',
name: 'Execute',
id: 35,
},
read_role: {
description: 'May view settings for the workflow job template',
name: 'Read',
id: 36,
},
approval_role: {
description: 'Can approve or deny a workflow approval node',
name: 'Approve',
id: 37,
},
},
user_capabilities: {
edit: true,
delete: true,
start: true,
schedule: true,
copy: true,
},
labels: {
count: 1,
results: [
{
id: 2,
name: 'Test2',
},
],
},
survey: {
title: '',
description: '',
},
recent_jobs: [
{
id: 111,
status: 'successful',
finished: '2022-05-10T17:29:52.978531Z',
canceled_on: null,
type: 'workflow_job',
},
{
id: 104,
status: 'failed',
finished: '2022-05-10T15:26:22.233170Z',
canceled_on: null,
type: 'workflow_job',
},
],
},
created: '2022-05-05T14:13:36.123027Z',
modified: '2022-05-05T17:44:44.071447Z',
name: '00',
description: '',
last_job_run: '2022-05-10T17:29:52.978531Z',
last_job_failed: false,
next_job_run: null,
status: 'successful',
extra_vars: '{\n "foo": "bar",\n "baz": "qux"\n}',
organization: null,
survey_enabled: true,
allow_simultaneous: true,
ask_variables_on_launch: true,
inventory: null,
limit: null,
scm_branch: '',
ask_inventory_on_launch: true,
ask_scm_branch_on_launch: true,
ask_limit_on_launch: true,
webhook_service: 'github',
webhook_credential: null,
};
const workflowJob = {
id: 111,
type: 'workflow_job',
@@ -270,9 +126,6 @@ const workflowJob = {
describe('<WorkflowApprovalDetail />', () => {
beforeEach(() => {
WorkflowJobTemplatesAPI.readDetail.mockResolvedValue({
data: workflowJobTemplate,
});
WorkflowJobsAPI.readDetail.mockResolvedValue({ data: workflowJob });
});
@@ -482,9 +335,6 @@ describe('<WorkflowApprovalDetail />', () => {
});
test('should not load Labels', async () => {
WorkflowJobTemplatesAPI.readDetail.mockResolvedValue({
data: workflowJobTemplate,
});
WorkflowJobsAPI.readDetail.mockResolvedValue({
data: {
...workflowApproval,
@@ -621,4 +471,16 @@ describe('<WorkflowApprovalDetail />', () => {
(el) => el.length === 0
);
});
test('should fetch its workflow job details', async () => {
let wrapper;
await act(async () => {
wrapper = mountWithContexts(
<WorkflowApprovalDetail workflowApproval={workflowApproval} />
);
});
waitForElement(wrapper, 'WorkflowApprovalDetail', (el) => el.length > 0);
expect(WorkflowJobsAPI.readDetail).toHaveBeenCalledTimes(1);
expect(WorkflowJobsAPI.readDetail).toHaveBeenCalledWith(216);
});
});

View File

@@ -87,7 +87,7 @@ options:
update_secrets:
description:
- C(true) will always update encrypted values.
- C(false) will only updated encrypted values if a change is absolutely known to be needed.
- C(false) will only update encrypted values if a change is absolutely known to be needed.
type: bool
default: true
user:
@@ -100,8 +100,8 @@ options:
type: str
state:
description:
- Desired state of the resource.
choices: ["present", "absent"]
- Desired state of the resource. C(exists) will not modify the resource if it is present.
choices: ["present", "absent", "exists"]
default: "present"
type: str
@@ -216,7 +216,7 @@ def main():
update_secrets=dict(type='bool', default=True, no_log=False),
user=dict(),
team=dict(),
state=dict(choices=['present', 'absent'], default='present'),
state=dict(choices=['present', 'absent', 'exists'], default='present'),
)
# Create a module for ourselves
@@ -265,6 +265,10 @@ def main():
# If the state was absent we can let the module delete it if needed, the module will handle exiting from this
module.delete_if_needed(credential)
if state == 'exists' and credential is not None:
# If credential exists and state is exists, we're done here.
module.exit_json(**module.json_output)
# Attempt to look up the related items the user specified (these will fail the module if not found)
if user:
user_id = module.resolve_name_to_id('users', user)
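
Summarizing the new control flow: state=exists creates the credential when it is missing but never modifies one that already exists, unlike present, which reconciles every field on each run. A sketch of the branch order (not the module verbatim):

def handle_state(module, state, credential):
    if state == 'absent':
        module.delete_if_needed(credential)      # module exits if a delete occurred
    elif state == 'exists' and credential is not None:
        module.exit_json(**module.json_output)   # found it; change nothing
    # otherwise fall through to the normal create-or-update path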

View File

@@ -10,7 +10,7 @@ from awx.main.models import WorkflowJob
@pytest.mark.django_db
def test_bulk_job_launch(run_module, admin_user, job_template):
jobs = [dict(unified_job_template=job_template.id)]
run_module(
result = run_module(
'bulk_job_launch',
{
'name': "foo-bulk-job",
@@ -21,6 +21,8 @@ def test_bulk_job_launch(run_module, admin_user, job_template):
},
admin_user,
)
assert not result.get('failed', False), result.get('msg', result)
assert result.get('changed'), result
bulk_job = WorkflowJob.objects.get(name="foo-bulk-job")
assert bulk_job.extra_vars == '{"animal": "owl"}'
@@ -30,7 +32,7 @@ def test_bulk_job_launch(run_module, admin_user, job_template):
@pytest.mark.django_db
def test_bulk_host_create(run_module, admin_user, inventory):
hosts = [dict(name="127.0.0.1"), dict(name="foo.dns.org")]
run_module(
result = run_module(
'bulk_host_create',
{
'inventory': inventory.name,
@@ -38,6 +40,8 @@ def test_bulk_host_create(run_module, admin_user, inventory):
},
admin_user,
)
assert not result.get('failed', False), result.get('msg', result)
assert result.get('changed'), result
resp_hosts = inventory.hosts.all().values_list('name', flat=True)
for h in hosts:
assert h['name'] in resp_hosts

View File

@@ -132,3 +132,20 @@ def test_secret_field_write_twice(run_module, organization, admin_user, cred_typ
else:
assert result.get('changed') is False, result
assert Credential.objects.get(id=result['id']).get_input('token') == val1
@pytest.mark.django_db
@pytest.mark.parametrize('state', ('present', 'absent', 'exists'))
def test_credential_state(run_module, organization, admin_user, cred_type, state):
result = run_module(
'credential',
dict(
name='Galaxy Token for Steve',
organization=organization.name,
credential_type=cred_type.name,
inputs={'token': '7rEZK38DJl58A7RxA6EC7lLvUHbBQ1'},
state=state,
),
admin_user,
)
assert not result.get('failed', False), result.get('msg', result)

View File

@@ -46,27 +46,24 @@
that:
- "command is changed"
- name: Timeout waiting for the command to cancel
- name: Cancel the command
ad_hoc_command_cancel:
command_id: "{{ command.id }}"
timeout: -1
register: results
ignore_errors: true
- assert:
that:
- results is failed
- "results['msg'] == 'Monitoring of ad hoc command aborted due to timeout'"
- results is changed
- block:
- name: "Wait for up to a minute until the job enters the can_cancel: False state"
debug:
msg: "The job can_cancel status has transitioned into False, we can proceed with testing"
until: not job_status
retries: 6
delay: 10
vars:
job_status: "{{ lookup('awx.awx.controller_api', 'ad_hoc_commands/'+ command.id | string +'/cancel')['can_cancel'] }}"
- name: "Wait for up to a minute until the job enters the can_cancel: False state"
debug:
msg: "The job can_cancel status has transitioned into False, we can proceed with testing"
until: not job_status
retries: 6
delay: 10
vars:
job_status: "{{ lookup('awx.awx.controller_api', 'ad_hoc_commands/'+ command.id | string +'/cancel')['can_cancel'] }}"
- name: Cancel the command with hard error if it's not running
ad_hoc_command_cancel:
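The rewritten tasks drop the artificial `timeout: -1` failure and instead cancel immediately, then poll the `can_cancel` flag until it flips to false, up to six retries ten seconds apart. The same retry loop in plain Python, as a sketch; `predicate` stands in for the `controller_api` lookup:

```python
import time

def wait_until(predicate, retries=6, delay=10):
    # Mirror the playbook's until/retries/delay loop.
    for _ in range(retries):
        if predicate():
            return True
        time.sleep(delay)
    return False

# Example: wait for a can_cancel flag held in a mutable cell to go False.
flag = {'can_cancel': False}
assert wait_until(lambda: not flag['can_cancel'], retries=1, delay=0)
```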

View File

@@ -108,9 +108,8 @@
- assert:
that:
- wait_results is failed
- 'wait_results.status == "canceled"'
- "wait_results.msg == 'The ad hoc command - {{ command.id }}, failed'"
- wait_results is successful
- 'wait_results.status == "successful"'
- name: Delete the Credential
credential:

View File

@@ -2,6 +2,7 @@
- name: Generate a test id
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
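This `when: test_id is not defined` guard recurs across the test targets below: each target generates a random suffix only when the caller has not already supplied one, so every object name within a run shares a single ID. The equivalent guard in Python, as a sketch (the playbooks use the `password` lookup rather than this hypothetical helper):

```python
import random
import string

def test_suffix(existing=None, length=16):
    # Honor a caller-supplied id first, matching `when: test_id is not defined`.
    if existing is not None:
        return existing
    # Equivalent of lookup('password', '/dev/null chars=ascii_letters length=16').
    return ''.join(random.choices(string.ascii_letters, k=length))

test_id = test_suffix()
host_name = f"AWX-Collection-tests-host-host-{test_id}"
inv_name = f"AWX-Collection-tests-host-inv-{test_id}"  # same suffix, by design
```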

View File

@@ -178,6 +178,60 @@
that:
- result is changed
- name: Delete an SSH credential
credential:
name: "{{ ssh_cred_name2 }}"
organization: Default
state: absent
credential_type: Machine
register: result
- assert:
that:
- "result is changed"
- name: Ensure existence of SSH credential
credential:
name: "{{ ssh_cred_name2 }}"
organization: Default
state: exists
credential_type: Machine
description: An example SSH awx.awx.credential
inputs:
username: joe
password: secret
become_method: sudo
become_username: superuser
become_password: supersecret
ssh_key_data: "{{ ssh_key_data }}"
ssh_key_unlock: "passphrase"
register: result
- assert:
that:
- result is changed
- name: Ensure existence of SSH credential, not updating any inputs
credential:
name: "{{ ssh_cred_name2 }}"
organization: Default
state: exists
credential_type: Machine
description: An example SSH awx.awx.credential
inputs:
username: joe
password: no-update-secret
become_method: sudo
become_username: some-other-superuser
become_password: some-other-secret
ssh_key_data: "{{ ssh_key_data }}"
ssh_key_unlock: "another-pass-phrase"
register: result
- assert:
that:
- result is not changed
- name: Create an invalid SSH credential (passphrase required)
credential:
name: SSH Credential

View File

@@ -1,7 +1,12 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
cred_type_name: "AWX-Collection-tests-credential_type-cred-type-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
cred_type_name: "AWX-Collection-tests-credential_type-cred-type-{{ test_id }}"
- block:
- name: Add Tower credential type

View File

@@ -1,13 +1,18 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
group_name1: "AWX-Collection-tests-group-group-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
group_name2: "AWX-Collection-tests-group-group-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
group_name3: "AWX-Collection-tests-group-group-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
inv_name: "AWX-Collection-tests-group-inv-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
host_name1: "AWX-Collection-tests-group-host-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
host_name2: "AWX-Collection-tests-group-host-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
host_name3: "AWX-Collection-tests-group-host-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
group_name1: "AWX-Collection-tests-group-group1-{{ test_id }}"
group_name2: "AWX-Collection-tests-group-group2-{{ test_id }}"
group_name3: "AWX-Collection-tests-group-group3-{{ test_id }}"
inv_name: "AWX-Collection-tests-group-inv-{{ test_id }}"
host_name1: "AWX-Collection-tests-group-host1-{{ test_id }}"
host_name2: "AWX-Collection-tests-group-host2-{{ test_id }}"
host_name3: "AWX-Collection-tests-group-host3-{{ test_id }}"
- name: Create an Inventory
inventory:
@@ -117,9 +122,10 @@
state: absent
register: result
# In this case, group2 was last a child of group1, so deleting group1 also deleted group2
- assert:
that:
- "result is changed"
- "result is not changed"
- name: Delete a Group
group:
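The flipped assertion reflects a cascade: group2 was only a child of group1 at that point, so deleting group1 took group2 with it, and a second delete finds nothing. A simplified sketch of that behavior (the real cascade only removes children with no other parent; this toy version assumes single-parent groups):

```python
# Toy model: each group tracks its child group names.
groups = {'group1': {'children': ['group2']}, 'group2': {'children': []}}

def delete_group(name):
    node = groups.pop(name, None)
    if node is None:
        return {'changed': False}   # already gone
    for child in node['children']:
        delete_group(child)         # cascade to orphaned children
    return {'changed': True}

print(delete_group('group1'))  # {'changed': True}
print(delete_group('group2'))  # {'changed': False} -> matches the new assert
```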

View File

@@ -1,8 +1,13 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
host_name: "AWX-Collection-tests-host-host-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
inv_name: "AWX-Collection-tests-host-inv-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
host_name: "AWX-Collection-tests-host-host-{{ test_id }}"
inv_name: "AWX-Collection-tests-host-inv-{{ test_id }}"
- name: Create an Inventory
inventory:

View File

@@ -1,14 +1,25 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate hostnames
set_fact:
hostname1: "AWX-Collection-tests-instance1.{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}.example.com"
hostname2: "AWX-Collection-tests-instance2.{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}.example.com"
hostname3: "AWX-Collection-tests-instance3.{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}.example.com"
hostname1: "AWX-Collection-tests-instance1.{{ test_id }}.example.com"
hostname2: "AWX-Collection-tests-instance2.{{ test_id }}.example.com"
hostname3: "AWX-Collection-tests-instance3.{{ test_id }}.example.com"
register: facts
- name: Show hostnames
debug:
var: facts
- name: Get the k8s setting
set_fact:
IS_K8S: "{{ controller_settings['IS_K8S'] | default(False) }}"
vars:
controller_settings: "{{ lookup('awx.awx.controller_api', 'settings/all') }}"
- debug:
msg: "Skipping instance test since this is instance is not running on a K8s platform"
when: not IS_K8S
- block:
- name: Create an instance
@@ -57,3 +68,5 @@
- "{{ hostname1 }}"
- "{{ hostname2 }}"
- "{{ hostname3 }}"
when: IS_K8S
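Instance provisioning only works on Kubernetes-based deployments, so the test now reads `IS_K8S` from the settings endpoint and gates every block on it. The decision in miniature; the `settings` dict stands in for the `settings/all` lookup result:

```python
def should_run_instance_tests(settings):
    # Default to False when the key is absent, as the playbook does.
    return bool(settings.get('IS_K8S', False))

print(should_run_instance_tests({'IS_K8S': True}))  # True  -> run the block
print(should_run_instance_tests({}))                # False -> skip with a message
```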

View File

@@ -2,6 +2,7 @@
- name: Generate test id
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:

View File

@@ -2,6 +2,7 @@
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:

View File

@@ -1,9 +1,14 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
openstack_cred: "AWX-Collection-tests-inventory_source-cred-openstack-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
openstack_inv: "AWX-Collection-tests-inventory_source-inv-openstack-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
openstack_inv_source: "AWX-Collection-tests-inventory_source-inv-source-openstack-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
openstack_cred: "AWX-Collection-tests-inventory_source-cred-openstack-{{ test_id }}"
openstack_inv: "AWX-Collection-tests-inventory_source-inv-openstack-{{ test_id }}"
openstack_inv_source: "AWX-Collection-tests-inventory_source-inv-source-openstack-{{ test_id }}"
- name: Add a credential
credential:

View File

@@ -2,6 +2,7 @@
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:

View File

@@ -1,9 +1,14 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
jt_name1: "AWX-Collection-tests-job_launch-jt1-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
jt_name2: "AWX-Collection-tests-job_launch-jt2-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
proj_name: "AWX-Collection-tests-job_launch-project-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
jt_name1: "AWX-Collection-tests-job_launch-jt1-{{ test_id }}"
jt_name2: "AWX-Collection-tests-job_launch-jt2-{{ test_id }}"
proj_name: "AWX-Collection-tests-job_launch-project-{{ test_id }}"
- name: Launch a Job Template
job_launch:

View File

@@ -2,6 +2,7 @@
- name: Generate a random string for test
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: generate random string for project
set_fact:

View File

@@ -1,8 +1,13 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate random string for template and project
set_fact:
jt_name: "AWX-Collection-tests-job_wait-long_running-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
proj_name: "AWX-Collection-tests-job_wait-long_running-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
jt_name: "AWX-Collection-tests-job_wait-long_running-{{ test_id }}"
proj_name: "AWX-Collection-tests-job_wait-long_running-{{ test_id }}"
- name: Assure that the demo project exists
project:

View File

@@ -1,7 +1,12 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
label_name: "AWX-Collection-tests-label-label-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
label_name: "AWX-Collection-tests-label-label-{{ test_id }}"
- name: Create a Label
label:

View File

@@ -101,6 +101,9 @@
set_fact:
users: "{{ query(plugin_name, 'users', query_params={ 'username__endswith': test_id, 'page_size': 2 }, return_ids=True ) }}"
- debug:
msg: "{{ users }}"
- name: Assert that user list has 2 ids only and that they are strings, not ints
assert:
that:
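The added debug task prints the looked-up list, and the assertion pins down a subtle behavior: with `return_ids=True` the lookup returns primary keys, and Jinja templating renders them as strings rather than ints. What the check amounts to, with illustrative sample values:

```python
# Sample values are illustrative; real ids come from the users endpoint.
users = ['41', '42']
assert len(users) == 2
assert all(isinstance(u, str) for u in users)
```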

View File

@@ -1,12 +1,17 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
slack_not: "AWX-Collection-tests-notification_template-slack-not-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
webhook_not: "AWX-Collection-tests-notification_template-wehbook-not-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
email_not: "AWX-Collection-tests-notification_template-email-not-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
twillo_not: "AWX-Collection-tests-notification_template-twillo-not-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
pd_not: "AWX-Collection-tests-notification_template-pd-not-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
irc_not: "AWX-Collection-tests-notification_template-irc-not-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
slack_not: "AWX-Collection-tests-notification_template-slack-not-{{ test_id }}"
webhook_not: "AWX-Collection-tests-notification_template-wehbook-not-{{ test_id }}"
email_not: "AWX-Collection-tests-notification_template-email-not-{{ test_id }}"
twillo_not: "AWX-Collection-tests-notification_template-twillo-not-{{ test_id }}"
pd_not: "AWX-Collection-tests-notification_template-pd-not-{{ test_id }}"
irc_not: "AWX-Collection-tests-notification_template-irc-not-{{ test_id }}"
- name: Create Slack notification with custom messages
notification_template:

View File

@@ -2,6 +2,7 @@
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate an org name
set_fact:

View File

@@ -1,13 +1,18 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
project_name1: "AWX-Collection-tests-project-project1-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
project_name2: "AWX-Collection-tests-project-project2-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
project_name3: "AWX-Collection-tests-project-project3-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
jt1: "AWX-Collection-tests-project-jt1-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
scm_cred_name: "AWX-Collection-tests-project-scm-cred-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
org_name: "AWX-Collection-tests-project-org-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
cred_name: "AWX-Collection-tests-project-cred-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
project_name1: "AWX-Collection-tests-project-project1-{{ test_id }}"
project_name2: "AWX-Collection-tests-project-project2-{{ test_id }}"
project_name3: "AWX-Collection-tests-project-project3-{{ test_id }}"
jt1: "AWX-Collection-tests-project-jt1-{{ test_id }}"
scm_cred_name: "AWX-Collection-tests-project-scm-cred-{{ test_id }}"
org_name: "AWX-Collection-tests-project-org-{{ test_id }}"
cred_name: "AWX-Collection-tests-project-cred-{{ test_id }}"
- block:
- name: Create an SCM Credential

View File

@@ -1,56 +0,0 @@
---
- name: Load the UI settings
set_fact:
project_base_dir: "{{ controller_settings.project_base_dir }}"
vars:
controller_settings: "{{ lookup('awx.awx.controller_api', 'config/') }}"
- inventory:
name: localhost
organization: Default
- host:
name: localhost
inventory: localhost
variables:
ansible_connection: local
- name: Create an unused SSH / Machine credential
credential:
name: dummy
credential_type: Machine
inputs:
ssh_key_data: |
-----BEGIN EC PRIVATE KEY-----
MHcCAQEEIIUl6R1xgzR6siIUArz4XBPtGZ09aetma2eWf1v3uYymoAoGCCqGSM49
AwEHoUQDQgAENJNjgeZDAh/+BY860s0yqrLDprXJflY0GvHIr7lX3ieCtrzOMCVU
QWzw35pc5tvuP34SSi0ZE1E+7cVMDDOF3w==
-----END EC PRIVATE KEY-----
organization: Default
- block:
- name: Add a path to a setting
settings:
name: AWX_ISOLATION_SHOW_PATHS
value: "[{{ project_base_dir }}]"
- name: Create a directory for manual project
ad_hoc_command:
credential: dummy
inventory: localhost
job_type: run
module_args: "mkdir -p {{ project_base_dir }}/{{ project_dir_name }}"
module_name: command
wait: true
always:
- name: Delete path from setting
settings:
name: AWX_ISOLATION_SHOW_PATHS
value: []
- name: Delete dummy credential
credential:
name: dummy
credential_type: Machine
state: absent

View File

@@ -1,38 +0,0 @@
---
- name: Generate random string for project
set_fact:
rand_string: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
- name: Generate manual project name
set_fact:
project_name: "Manual_Project_{{ rand_string }}"
- name: Generate manual project dir name
set_fact:
project_dir_name: "proj_{{ rand_string }}"
- name: Create a project directory for manual project
import_tasks: create_project_dir.yml
- name: Create a manual project
project:
name: "{{ project_name }}"
organization: Default
scm_type: manual
local_path: "{{ project_dir_name }}"
register: result
- assert:
that:
- "result is changed"
- name: Delete a manual project
project:
name: "{{ project_name }}"
organization: Default
state: absent
register: result
- assert:
that:
- "result is changed"

View File

@@ -2,6 +2,7 @@
- name: Generate a test id
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:

View File

@@ -2,6 +2,7 @@
- name: Generate a random string for test
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: generate random string for schedule
set_fact:

View File

@@ -1,7 +1,12 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
team_name: "AWX-Collection-tests-team-team-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
team_name: "AWX-Collection-tests-team-team-{{ test_id }}"
- name: Attempt to add a team to a non-existent Organization
team:

View File

@@ -1,7 +1,12 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
token_description: "AWX-Collection-tests-token-description-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
token_description: "AWX-Collection-tests-token-description-{{ test_id }}"
- name: Try to use a token as a dict which is missing the token parameter
job_list:

View File

@@ -1,7 +1,12 @@
---
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate names
set_fact:
username: "AWX-Collection-tests-user-user-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
username: "AWX-Collection-tests-user-user-{{ test_id }}"
- name: Create a User
user:
@@ -131,10 +136,6 @@
'Can not verify ssl with non-https protocol' in result.exception"
- block:
- name: Generate a test ID
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
- name: Generate an org name
set_fact:
org_name: "AWX-Collection-tests-organization-org-{{ test_id }}"

View File

@@ -2,13 +2,15 @@
- name: Generate a random string for names
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
test_prefix: AWX-Collection-tests-workflow_approval
when: test_id is not defined
- name: Generate random names for test objects
set_fact:
org_name: "{{ test_prefix }}-org-{{ test_id }}"
approval_node_name: "{{ test_prefix }}-node-{{ test_id }}"
wfjt_name: "{{ test_prefix }}-wfjt-{{ test_id }}"
vars:
test_prefix: AWX-Collection-tests-workflow_approval
- block:
- name: Create a new organization for test isolation

View File

@@ -2,6 +2,7 @@
- name: Generate a random string for names
set_fact:
test_id: "{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
when: test_id is not defined
- name: Generate random names for test objects
set_fact:
@@ -17,8 +18,8 @@
webhook_wfjt_name: "AWX-Collection-tests-workflow_job_template-webhook-wfjt-{{ test_id }}"
email_not: "AWX-Collection-tests-job_template-email-not-{{ test_id }}"
webhook_notification: "AWX-Collection-tests-notification_template-wehbook-not-{{ test_id }}"
project_inv: "AWX-Collection-tests-inventory_source-inv-project-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
project_inv_source: "AWX-Collection-tests-inventory_source-inv-source-project-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"
project_inv: "AWX-Collection-tests-inventory_source-inv-project-{{ test_id }}"
project_inv_source: "AWX-Collection-tests-inventory_source-inv-source-project-{{ test_id }}"
github_webhook_credential_name: "AWX-Collection-tests-credential-webhook-{{ test_id }}_github"
ee1: "AWX-Collection-tests-workflow_job_template-ee1-{{ test_id }}"
label1: "AWX-Collection-tests-workflow_job_template-l1-{{ test_id }}"

View File

@@ -12,8 +12,13 @@ To encrypt secret fields, AWX uses AES in CBC mode with a 256-bit key for encryp
If necessary, credentials and encrypted settings can be extracted using the AWX shell:
```python
# awx-manage shell_plus
$ awx-manage shell_plus
>>> from awx.main.utils import decrypt_field
>>> cred = Credential.objects.get(name="my private key")
>>> print(decrypt_field(cred, "ssh_key_data"))
>>> print(decrypt_field(Credential.objects.get(name="my private key"), "ssh_key_data")) # Example for a credential
>>> print(decrypt_field(Setting.objects.get(key='SOCIAL_AUTH_AZUREAD_OAUTH2_SECRET'), 'value')) # Example for a setting
```
If you are running a kubernetes based deployment, you can execute awx-manage like this:
```bash
$ kubectl exec --stdin --tty [instance name]-task-[...] -c [instance name]-task -- awx-manage shell_plus
```

licenses/deprecated.txt Normal file
View File

@@ -0,0 +1,9 @@
The MIT License (MIT)
Copyright (c) 2017 Laurent LAPORTE
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

Binary file not shown.

licenses/jwcrypto.txt Normal file
View File

@@ -0,0 +1,165 @@
GNU LESSER GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
This version of the GNU Lesser General Public License incorporates
the terms and conditions of version 3 of the GNU General Public
License, supplemented by the additional permissions listed below.
0. Additional Definitions.
As used herein, "this License" refers to version 3 of the GNU Lesser
General Public License, and the "GNU GPL" refers to version 3 of the GNU
General Public License.
"The Library" refers to a covered work governed by this License,
other than an Application or a Combined Work as defined below.
An "Application" is any work that makes use of an interface provided
by the Library, but which is not otherwise based on the Library.
Defining a subclass of a class defined by the Library is deemed a mode
of using an interface provided by the Library.
A "Combined Work" is a work produced by combining or linking an
Application with the Library. The particular version of the Library
with which the Combined Work was made is also called the "Linked
Version".
The "Minimal Corresponding Source" for a Combined Work means the
Corresponding Source for the Combined Work, excluding any source code
for portions of the Combined Work that, considered in isolation, are
based on the Application, and not on the Linked Version.
The "Corresponding Application Code" for a Combined Work means the
object code and/or source code for the Application, including any data
and utility programs needed for reproducing the Combined Work from the
Application, but excluding the System Libraries of the Combined Work.
1. Exception to Section 3 of the GNU GPL.
You may convey a covered work under sections 3 and 4 of this License
without being bound by section 3 of the GNU GPL.
2. Conveying Modified Versions.
If you modify a copy of the Library, and, in your modifications, a
facility refers to a function or data to be supplied by an Application
that uses the facility (other than as an argument passed when the
facility is invoked), then you may convey a copy of the modified
version:
a) under this License, provided that you make a good faith effort to
ensure that, in the event an Application does not supply the
function or data, the facility still operates, and performs
whatever part of its purpose remains meaningful, or
b) under the GNU GPL, with none of the additional permissions of
this License applicable to that copy.
3. Object Code Incorporating Material from Library Header Files.
The object code form of an Application may incorporate material from
a header file that is part of the Library. You may convey such object
code under terms of your choice, provided that, if the incorporated
material is not limited to numerical parameters, data structure
layouts and accessors, or small macros, inline functions and templates
(ten or fewer lines in length), you do both of the following:
a) Give prominent notice with each copy of the object code that the
Library is used in it and that the Library and its use are
covered by this License.
b) Accompany the object code with a copy of the GNU GPL and this license
document.
4. Combined Works.
You may convey a Combined Work under terms of your choice that,
taken together, effectively do not restrict modification of the
portions of the Library contained in the Combined Work and reverse
engineering for debugging such modifications, if you also do each of
the following:
a) Give prominent notice with each copy of the Combined Work that
the Library is used in it and that the Library and its use are
covered by this License.
b) Accompany the Combined Work with a copy of the GNU GPL and this license
document.
c) For a Combined Work that displays copyright notices during
execution, include the copyright notice for the Library among
these notices, as well as a reference directing the user to the
copies of the GNU GPL and this license document.
d) Do one of the following:
0) Convey the Minimal Corresponding Source under the terms of this
License, and the Corresponding Application Code in a form
suitable for, and under terms that permit, the user to
recombine or relink the Application with a modified version of
the Linked Version to produce a modified Combined Work, in the
manner specified by section 6 of the GNU GPL for conveying
Corresponding Source.
1) Use a suitable shared library mechanism for linking with the
Library. A suitable mechanism is one that (a) uses at run time
a copy of the Library already present on the user's computer
system, and (b) will operate properly with a modified version
of the Library that is interface-compatible with the Linked
Version.
e) Provide Installation Information, but only if you would otherwise
be required to provide such information under section 6 of the
GNU GPL, and only to the extent that such information is
necessary to install and execute a modified version of the
Combined Work produced by recombining or relinking the
Application with a modified version of the Linked Version. (If
you use option 4d0, the Installation Information must accompany
the Minimal Corresponding Source and Corresponding Application
Code. If you use option 4d1, you must provide the Installation
Information in the manner specified by section 6 of the GNU GPL
for conveying Corresponding Source.)
5. Combined Libraries.
You may place library facilities that are a work based on the
Library side by side in a single library together with other library
facilities that are not Applications and are not covered by this
License, and convey such a combined library under terms of your
choice, if you do both of the following:
a) Accompany the combined library with a copy of the same work based
on the Library, uncombined with any other library facilities,
conveyed under the terms of this License.
b) Give prominent notice with the combined library that part of it
is a work based on the Library, and explaining where to find the
accompanying uncombined form of the same work.
6. Revised Versions of the GNU Lesser General Public License.
The Free Software Foundation may publish revised and/or new versions
of the GNU Lesser General Public License from time to time. Such new
versions will be similar in spirit to the present version, but may
differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the
Library as you received it specifies that a certain numbered version
of the GNU Lesser General Public License "or any later version"
applies to it, you have the option of following the terms and
conditions either of that published version or of any later version
published by the Free Software Foundation. If the Library as you
received it does not specify a version number of the GNU Lesser
General Public License, you may choose any version of the GNU Lesser
General Public License ever published by the Free Software Foundation.
If the Library as you received it specifies that a proxy can decide
whether future versions of the GNU Lesser General Public License shall
apply, that proxy's public statement of acceptance of any version is
permanent authorization for you to choose that version for the
Library.

Binary file not shown.

Binary file not shown.

licenses/wrapt.txt Normal file
View File

@@ -0,0 +1,24 @@
Copyright (c) 2013-2023, Graham Dumpleton
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.

View File

@@ -1,5 +1,5 @@
[pytest]
DJANGO_SETTINGS_MODULE = awx.settings.development
DJANGO_SETTINGS_MODULE = awx.main.tests.settings_for_test
python_paths = /var/lib/awx/venv/tower/lib/python3.8/site-packages
site_dirs = /var/lib/awx/venv/tower/lib/python3.8/site-packages
python_files = *.py

View File

@@ -9,20 +9,20 @@ cryptography
Cython<3 # Since the bump to PyYAML 5.4.1 this is now a mandatory dep
daphne
distro
django==3.2.16 # see UPGRADE BLOCKERs https://github.com/ansible/awx/security/dependabot/67
django==4.2 # see UPGRADE BLOCKERs
django-auth-ldap
django-cors-headers
django-crum
django-extensions
django-guid==3.2.1
django-oauth-toolkit==1.4.1
django-oauth-toolkit<2.0.0 # Version 2.0.0 has breaking changes that will need to be worked out before upgrading
django-polymorphic
django-pglocks
django-redis
django-solo
django-split-settings==1.0.0 # We hit a strange issue where the release process errored when upgrading past 1.0.0; see UPGRADE BLOCKERS
django-taggit
djangorestframework==3.13.1
djangorestframework
djangorestframework-yaml
filelock
GitPython>=3.1.30 # CVE-2022-24439
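The `django-oauth-toolkit` pin switches from an exact version to a ceiling, which lets 1.x patch releases in while blocking the 2.x breaking changes. How pip evaluates that specifier, shown with the `packaging` library; the version numbers are the ones in the dev lockfile below:

```python
from packaging.specifiers import SpecifierSet

spec = SpecifierSet("<2.0.0")
print("1.7.1" in spec)  # True  -> the dev lockfile resolves to this
print("2.0.0" in spec)  # False -> blocked until the breaking changes are handled
```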

View File

@@ -11,7 +11,7 @@ ansiconv==1.0.0
# via -r /awx_devel/requirements/requirements.in
asciichartpy==1.5.25
# via -r /awx_devel/requirements/requirements.in
asgiref==3.5.2
asgiref==3.6.0
# via
# channels
# channels-redis
@@ -74,6 +74,7 @@ cryptography==38.0.4
# adal
# autobahn
# azure-keyvault
# jwcrypto
# pyopenssl
# service-identity
# social-auth-core
@@ -91,9 +92,11 @@ defusedxml==0.7.1
# via
# python3-openid
# social-auth-core
deprecated==1.2.13
# via jwcrypto
distro==1.8.0
# via -r /awx_devel/requirements/requirements.in
django==3.2.16
django==4.2
# via
# -r /awx_devel/requirements/requirements.in
# channels
@@ -118,7 +121,7 @@ django-extensions==3.2.1
# via -r /awx_devel/requirements/requirements.in
django-guid==3.2.1
# via -r /awx_devel/requirements/requirements.in
django-oauth-toolkit==1.4.1
django-oauth-toolkit==1.7.1
# via -r /awx_devel/requirements/requirements.in
django-pglocks==1.0.4
# via -r /awx_devel/requirements/requirements.in
@@ -133,7 +136,7 @@ django-split-settings==1.0.0
# via -r /awx_devel/requirements/requirements.in
django-taggit==3.1.0
# via -r /awx_devel/requirements/requirements.in
djangorestframework==3.13.1
djangorestframework==3.14.0
# via -r /awx_devel/requirements/requirements.in
djangorestframework-yaml==2.0.0
# via -r /awx_devel/requirements/requirements.in
@@ -210,6 +213,8 @@ json-log-formatter==0.5.1
# via -r /awx_devel/requirements/requirements.in
jsonschema==4.17.3
# via -r /awx_devel/requirements/requirements.in
jwcrypto==1.4.2
# via django-oauth-toolkit
kubernetes==25.3.0
# via openshift
lockfile==0.12.2
@@ -266,7 +271,7 @@ prometheus-client==0.15.0
# via -r /awx_devel/requirements/requirements.in
psutil==5.9.4
# via -r /awx_devel/requirements/requirements.in
psycopg==3.1.4
psycopg==3.1.8
# via -r /awx_devel/requirements/requirements.in
psycopg2==2.9.5
# via -r /awx_devel/requirements/requirements.in
@@ -329,7 +334,6 @@ python3-openid==3.2.0
# via -r /awx_devel/requirements/requirements_git.txt
pytz==2022.6
# via
# django
# djangorestframework
# irc
# tempora
@@ -444,6 +448,8 @@ websocket-client==1.4.2
# via kubernetes
wheel==0.38.4
# via -r /awx_devel/requirements/requirements.in
wrapt==1.15.0
# via deprecated
xmlsec==1.3.13
# via python3-saml
yarl==1.8.1

View File

@@ -4,8 +4,7 @@
gather_facts: true
tasks:
- name: Get version from SCM if not explicitly provided
shell: |
make print-VERSION | cut -d + -f -1
command: make version-for-buildyml
args:
chdir: '../../'
register: scm_version

View File

@@ -215,6 +215,13 @@ ADD tools/docker-compose/entrypoint.sh /entrypoint.sh
ADD tools/scripts/config-watcher /usr/bin/config-watcher
ADD tools/docker-compose/containers.conf /etc/containers/containers.conf
ADD tools/docker-compose/podman-containers.conf /var/lib/awx/.config/containers/containers.conf
{% elif kube_dev|bool %}
RUN ln -sf /awx_devel/tools/ansible/roles/dockerfile/files/launch_awx_web.sh /usr/bin/launch_awx_web.sh
RUN ln -sf /awx_devel/tools/ansible/roles/dockerfile/files/launch_awx_task.sh /usr/bin/launch_awx_task.sh
RUN ln -sf /awx_devel/tools/ansible/roles/dockerfile/files/launch_awx_rsyslog.sh /usr/bin/launch_awx_rsyslog.sh
RUN ln -sf /awx_devel/{{ template_dest }}/supervisor_web.conf /etc/supervisord_web.conf
RUN ln -sf /awx_devel/{{ template_dest }}/supervisor_task.conf /etc/supervisord_task.conf
RUN ln -sf /awx_devel/{{ template_dest }}/supervisor_rsyslog.conf /etc/supervisord_rsyslog.conf
{% else %}
ADD tools/ansible/roles/dockerfile/files/launch_awx_web.sh /usr/bin/launch_awx_web.sh
ADD tools/ansible/roles/dockerfile/files/launch_awx_task.sh /usr/bin/launch_awx_task.sh
@@ -223,8 +230,9 @@ ADD {{ template_dest }}/supervisor_web.conf /etc/supervisord_web.conf
ADD {{ template_dest }}/supervisor_task.conf /etc/supervisord_task.conf
ADD {{ template_dest }}/supervisor_rsyslog.conf /etc/supervisord_rsyslog.conf
{% endif %}
{% if (build_dev|bool) or (kube_dev|bool) %}
ADD tools/docker-compose/awx.egg-link /tmp/awx.egg-link
RUN echo /awx_devel > /var/lib/awx/venv/awx/lib/python3.9/site-packages/awx.egg-link
ADD tools/docker-compose/awx-manage /usr/local/bin/awx-manage
ADD tools/scripts/awx-python /usr/bin/awx-python
{% endif %}
@@ -277,8 +285,7 @@ RUN for dir in \
/var/lib/shared/overlay-layers/layers.lock \
/var/lib/shared/vfs-images/images.lock \
/var/lib/shared/vfs-layers/layers.lock \
/var/run/nginx.pid \
/var/lib/awx/venv/awx/lib/python3.9/site-packages/awx.egg-link ; \
/var/run/nginx.pid; \
do touch $file ; chmod g+rw $file ; done && \
echo "\setenv PAGER 'less -SXF'" > /var/lib/awx/.psqlrc
{% endif %}

View File

@@ -33,6 +33,19 @@ stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0
{% if kube_dev | bool %}
[program:awx-autoreload]
command = /awx_devel/tools/docker-compose/awx-autoreload /awx_devel/awx 'supervisorctl -c /etc/supervisord_rsyslog.conf restart tower-processes:*'
autostart = true
autorestart = true
stopasgroup=true
killasgroup=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0
{% endif %}
[group:tower-processes]
programs=awx-rsyslog-configurer,awx-rsyslogd
priority=5

View File

@@ -56,6 +56,19 @@ stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0
{% if kube_dev | bool %}
[program:awx-autoreload]
command = /awx_devel/tools/docker-compose/awx-autoreload /awx_devel/awx 'supervisorctl -c /etc/supervisord_task.conf restart tower-processes:*'
autostart = true
autorestart = true
stopasgroup=true
killasgroup=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0
{% endif %}
[group:tower-processes]
programs=dispatcher,callback-receiver,wsrelay
priority=5
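Both supervisor templates gain the same dev-only `awx-autoreload` program: a watcher that restarts the `tower-processes` group when source under `/awx_devel/awx` changes. A naive polling sketch of that idea; the real helper lives in `tools/docker-compose/awx-autoreload`, and this stand-in only compares file mtimes:

```python
import subprocess
import time
from pathlib import Path

def watch_and_restart(root, restart_cmd, interval=2.0):
    # Poll source mtimes; on any change, run the supervisorctl restart command.
    last = None
    while True:
        newest = max((p.stat().st_mtime for p in Path(root).rglob('*.py')), default=0.0)
        if last is not None and newest > last:
            subprocess.run(restart_cmd, shell=True, check=False)
        last = newest
        time.sleep(interval)

# watch_and_restart('/awx_devel/awx',
#                   "supervisorctl -c /etc/supervisord_task.conf restart tower-processes:*")
```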

Some files were not shown because too many files have changed in this diff.