.gitignore (3 changes, vendored)
@@ -34,8 +34,6 @@ awx/ui_next/coverage/
 awx/ui_next/build
 awx/ui_next/.env.local
 rsyslog.pid
-/tower-license
-/tower-license/**
 tools/prometheus/data
 tools/docker-compose/Dockerfile
 
@@ -147,3 +145,4 @@ use_dev_supervisor.txt
 .idea/*
 *.unison.tmp
 *.#
+/tools/docker-compose/overrides/
CHANGELOG.md (16 changes)
@@ -2,6 +2,22 @@
 
 This is a list of high-level changes for each release of AWX. A full list of commits can be found at `https://github.com/ansible/awx/releases/tag/<version>`.
 
+## 15.0.1 (October 20, 2020)
+- Added several optimizations to improve performance for a variety of high-load simultaneous job launch use cases https://github.com/ansible/awx/pull/8403
+- Added the ability to source roles and collections from requirements.yaml files (not just requirements.yml) - https://github.com/ansible/awx/issues/4540
+- awx.awx collection modules now provide a clearer error message for incompatible versions of awxkit - https://github.com/ansible/awx/issues/8127
+- Fixed a bug in notification messages that contain certain unicode characters - https://github.com/ansible/awx/issues/7400
+- Fixed a bug that prevents the deletion of Workflow Approval records - https://github.com/ansible/awx/issues/8305
+- Fixed a bug that broke the selection of webhook credentials - https://github.com/ansible/awx/issues/7892
+- Fixed a bug which can cause confusing behavior for social auth logins across distinct browser tabs - https://github.com/ansible/awx/issues/8154
+- Fixed several bugs in the output of Workflow Job Templates using the `awx export` tool - https://github.com/ansible/awx/issues/7798 https://github.com/ansible/awx/pull/7847
+- Fixed a race condition that can lead to missing hosts when running parallel inventory syncs - https://github.com/ansible/awx/issues/5571
+- Fixed an HTTP 500 error when certain LDAP group parameters aren't properly set - https://github.com/ansible/awx/issues/7622
+- Updated a few dependencies in response to several CVEs:
+  * CVE-2020-7720
+  * CVE-2020-7743
+  * CVE-2020-7676
+
 ## 15.0.0 (September 30, 2020)
 - Added improved support for fetching Ansible collections from private Galaxy content sources (such as https://github.com/ansible/galaxy_ng) - https://github.com/ansible/awx/issues/7813
 **Note:** as part of this change, new Organizations created in the AWX API will _no longer_ automatically synchronize roles and collections from galaxy.ansible.com by default. More details on this change can be found at: https://github.com/ansible/awx/issues/8341#issuecomment-707310633
@@ -78,6 +78,8 @@ Before you can run a deployment, you'll need the following installed in your loc
 - [docker](https://pypi.org/project/docker/) Python module
   + This is incompatible with `docker-py`. If you have previously installed `docker-py`, please uninstall it.
   + We use this module instead of `docker-py` because it is what the `docker-compose` Python module requires.
+- [community.general.docker_image collection](https://docs.ansible.com/ansible/latest/collections/community/general/docker_image_module.html)
+  + This is only required if you are using Ansible >= 2.10
 - [GNU Make](https://www.gnu.org/software/make/)
 - [Git](https://git-scm.com/) Requires Version 1.8.4+
 - Python 3.6+
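The dependency note above warns that the `docker` Python module conflicts with the obsolete `docker-py` package. A minimal sketch of checking that the module is importable at all (the helper name is hypothetical; note both packages install a top-level `docker` package, so pip metadata, not imports, is what distinguishes them):

```python
import importlib.util

def docker_module_importable():
    """Return True if a top-level 'docker' package can be imported.

    Both 'docker' and the obsolete 'docker-py' provide a package named
    'docker', so this only confirms something is installed; use
    `pip show docker` / `pip show docker-py` to tell which one.
    """
    return importlib.util.find_spec("docker") is not None

print(type(docker_module_importable()))  # <class 'bool'>
```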
@@ -4,8 +4,6 @@ recursive-include awx *.mo
 recursive-include awx/static *
 recursive-include awx/templates *.html
 recursive-include awx/api/templates *.md *.html
-recursive-include awx/ui/templates *.html
-recursive-include awx/ui/static *
 recursive-include awx/ui_next/build *.html
 recursive-include awx/ui_next/build *
 recursive-include awx/playbooks *.yml
Makefile (145 changes)
@@ -56,11 +56,6 @@ WHEEL_COMMAND ?= bdist_wheel
 SDIST_TAR_FILE ?= $(SDIST_TAR_NAME).tar.gz
 WHEEL_FILE ?= $(WHEEL_NAME)-py2-none-any.whl
 
-# UI flag files
-UI_DEPS_FLAG_FILE = awx/ui/.deps_built
-UI_RELEASE_DEPS_FLAG_FILE = awx/ui/.release_deps_built
-UI_RELEASE_FLAG_FILE = awx/ui/.release_built
-
 I18N_FLAG_FILE = .i18n_built
 
 .PHONY: awx-link clean clean-tmp clean-venv requirements requirements_dev \
@@ -70,22 +65,6 @@ I18N_FLAG_FILE = .i18n_built
 	ui-docker-machine ui-docker ui-release ui-devel \
 	ui-test ui-deps ui-test-ci VERSION
 
-# remove ui build artifacts
-clean-ui: clean-languages
-	rm -rf awx/ui/static/
-	rm -rf awx/ui/node_modules/
-	rm -rf awx/ui/test/unit/reports/
-	rm -rf awx/ui/test/spec/reports/
-	rm -rf awx/ui/test/e2e/reports/
-	rm -rf awx/ui/client/languages/
-	rm -rf awx/ui_next/node_modules/
-	rm -rf node_modules
-	rm -rf awx/ui_next/coverage/
-	rm -rf awx/ui_next/build/locales/_build/
-	rm -f $(UI_DEPS_FLAG_FILE)
-	rm -f $(UI_RELEASE_DEPS_FLAG_FILE)
-	rm -f $(UI_RELEASE_FLAG_FILE)
-
 clean-tmp:
 	rm -rf tmp/
 
@@ -214,7 +193,11 @@ requirements_awx_dev:
 
 requirements_collections:
 	mkdir -p $(COLLECTION_BASE)
-	ansible-galaxy collection install -r requirements/collections_requirements.yml -p $(COLLECTION_BASE)
+	n=0; \
+	until [ "$$n" -ge 5 ]; do \
+		ansible-galaxy collection install -r requirements/collections_requirements.yml -p $(COLLECTION_BASE) && break; \
+		n=$$((n+1)); \
+	done
 
 requirements: requirements_ansible requirements_awx requirements_collections
 
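The hunk above wraps `ansible-galaxy collection install` in a shell retry loop so that transient Galaxy outages don't fail the build. The same control flow, sketched in Python with the command stubbed out:

```python
def retry(command, attempts=5):
    """Run `command` until it succeeds or `attempts` runs are exhausted.

    Mirrors the Makefile loop:
        n=0; until [ "$n" -ge 5 ]; do cmd && break; n=$((n+1)); done
    Returns True on success, False if every attempt failed.
    """
    for _ in range(attempts):
        if command():
            return True
    return False

# A command that fails twice, then succeeds on the third try:
calls = []
def flaky():
    calls.append(1)
    return len(calls) >= 3

print(retry(flaky))  # True
print(len(calls))    # 3
```

Note the Makefile version deliberately ignores the final exit status (`done` ends the recipe line successfully even if all five attempts fail), which this sketch makes explicit by returning False.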
@@ -476,110 +459,23 @@ else
 	@echo No PO files
 endif
 
-# generate UI .pot
-pot: $(UI_DEPS_FLAG_FILE)
-	$(NPM_BIN) --prefix awx/ui run pot
-
-# generate django .pot .po
-LANG = "en-us"
-messages:
-	@if [ "$(VENV_BASE)" ]; then \
-		. $(VENV_BASE)/awx/bin/activate; \
-	fi; \
-	$(PYTHON) manage.py makemessages -l $(LANG) --keep-pot
-
-# generate l10n .json .mo
-languages: $(I18N_FLAG_FILE)
-
-$(I18N_FLAG_FILE): $(UI_RELEASE_DEPS_FLAG_FILE)
-	$(NPM_BIN) --prefix awx/ui run languages
-	$(PYTHON) tools/scripts/compilemessages.py
-	touch $(I18N_FLAG_FILE)
-
-# End l10n TASKS
-# --------------------------------------
-
-# UI RELEASE TASKS
-# --------------------------------------
-ui-release: $(UI_RELEASE_FLAG_FILE)
-
-$(UI_RELEASE_FLAG_FILE): $(I18N_FLAG_FILE) $(UI_RELEASE_DEPS_FLAG_FILE)
-	$(NPM_BIN) --prefix awx/ui run build-release
-	touch $(UI_RELEASE_FLAG_FILE)
-
-$(UI_RELEASE_DEPS_FLAG_FILE):
-	PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=1 $(NPM_BIN) --unsafe-perm --prefix awx/ui ci --no-save awx/ui
-	touch $(UI_RELEASE_DEPS_FLAG_FILE)
-
-# END UI RELEASE TASKS
-# --------------------------------------
-
 # UI TASKS
 # --------------------------------------
-ui-deps: $(UI_DEPS_FLAG_FILE)
-
-$(UI_DEPS_FLAG_FILE):
-	@if [ -f ${UI_RELEASE_DEPS_FLAG_FILE} ]; then \
-		rm -rf awx/ui/node_modules; \
-		rm -f ${UI_RELEASE_DEPS_FLAG_FILE}; \
-	fi; \
-	$(NPM_BIN) --unsafe-perm --prefix awx/ui ci --no-save awx/ui
-	touch $(UI_DEPS_FLAG_FILE)
-
-ui-docker-machine: $(UI_DEPS_FLAG_FILE)
-	$(NPM_BIN) --prefix awx/ui run ui-docker-machine -- $(MAKEFLAGS)
-
-# Native docker. Builds UI and raises BrowserSync & filesystem polling.
-ui-docker: $(UI_DEPS_FLAG_FILE)
-	$(NPM_BIN) --prefix awx/ui run ui-docker -- $(MAKEFLAGS)
-
-# Builds UI with development UI without raising browser-sync or filesystem polling.
-ui-devel: $(UI_DEPS_FLAG_FILE)
-	$(NPM_BIN) --prefix awx/ui run build-devel -- $(MAKEFLAGS)
-
-ui-test: $(UI_DEPS_FLAG_FILE)
-	$(NPM_BIN) --prefix awx/ui run test
-
-ui-lint: $(UI_DEPS_FLAG_FILE)
-	$(NPM_BIN) run --prefix awx/ui jshint
-	$(NPM_BIN) run --prefix awx/ui lint
-
-# A standard go-to target for API developers to use building the frontend
-ui: clean-ui ui-devel
-
-ui-test-ci: $(UI_DEPS_FLAG_FILE)
-	$(NPM_BIN) --prefix awx/ui run test:ci
-	$(NPM_BIN) --prefix awx/ui run unit
-
-jshint: $(UI_DEPS_FLAG_FILE)
-	$(NPM_BIN) run --prefix awx/ui jshint
-	$(NPM_BIN) run --prefix awx/ui lint
-
-ui-zuul-lint-and-test:
-	CHROMIUM_BIN=$(CHROMIUM_BIN) ./awx/ui/build/zuul_download_chromium.sh
-	CHROMIUM_BIN=$(CHROMIUM_BIN) PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=1 $(NPM_BIN) --unsafe-perm --prefix awx/ui ci --no-save awx/ui
-	CHROMIUM_BIN=$(CHROMIUM_BIN) $(NPM_BIN) run --prefix awx/ui jshint
-	CHROMIUM_BIN=$(CHROMIUM_BIN) $(NPM_BIN) run --prefix awx/ui lint
-	CHROME_BIN=$(CHROMIUM_BIN) $(NPM_BIN) --prefix awx/ui run test:ci
-	CHROME_BIN=$(CHROMIUM_BIN) $(NPM_BIN) --prefix awx/ui run unit
-
-# END UI TASKS
-# --------------------------------------
-
-# UI NEXT TASKS
-# --------------------------------------
-
 awx/ui_next/node_modules:
 	$(NPM_BIN) --prefix awx/ui_next install
 
-ui-release-next:
-	mkdir -p awx/ui_next/build/static
-	touch awx/ui_next/build/static/.placeholder
+clean-ui:
+	rm -rf node_modules
+	rm -rf awx/ui_next/node_modules
+	rm -rf awx/ui_next/build
 
-ui-devel-next: awx/ui_next/node_modules
+ui-release: ui-devel
+
+ui-devel: awx/ui_next/node_modules
 	$(NPM_BIN) --prefix awx/ui_next run extract-strings
 	$(NPM_BIN) --prefix awx/ui_next run compile-strings
 	$(NPM_BIN) --prefix awx/ui_next run build
+	git checkout awx/ui_next/src/locales
 	mkdir -p awx/public/static/css
 	mkdir -p awx/public/static/js
 	mkdir -p awx/public/static/media
@@ -587,19 +483,12 @@ ui-devel-next: awx/ui_next/node_modules
 	cp -r awx/ui_next/build/static/js/* awx/public/static/js
 	cp -r awx/ui_next/build/static/media/* awx/public/static/media
 
-clean-ui-next:
-	rm -rf node_modules
-	rm -rf awx/ui_next/node_modules
-	rm -rf awx/ui_next/build
-
-ui-next-zuul-lint-and-test:
+ui-zuul-lint-and-test:
 	$(NPM_BIN) --prefix awx/ui_next install
 	$(NPM_BIN) run --prefix awx/ui_next lint
 	$(NPM_BIN) run --prefix awx/ui_next prettier-check
 	$(NPM_BIN) run --prefix awx/ui_next test
 
-# END UI NEXT TASKS
-# --------------------------------------
-
 # Build a pip-installable package into dist/ with a timestamped version number.
 dev_build:
@@ -609,10 +498,10 @@ dev_build:
 release_build:
 	$(PYTHON) setup.py release_build
 
-dist/$(SDIST_TAR_FILE): ui-release ui-release-next VERSION
+dist/$(SDIST_TAR_FILE): ui-release VERSION
 	$(PYTHON) setup.py $(SDIST_COMMAND)
 
-dist/$(WHEEL_FILE): ui-release ui-release-next
+dist/$(WHEEL_FILE): ui-release
 	$(PYTHON) setup.py $(WHEEL_COMMAND)
 
 sdist: dist/$(SDIST_TAR_FILE)
@@ -646,9 +535,11 @@ awx/projects:
 docker-compose-isolated: awx/projects
 	CURRENT_UID=$(shell id -u) TAG=$(COMPOSE_TAG) DEV_DOCKER_TAG_BASE=$(DEV_DOCKER_TAG_BASE) docker-compose -f tools/docker-compose.yml -f tools/docker-isolated-override.yml up
 
+COMPOSE_UP_OPTS ?=
+
 # Docker Compose Development environment
 docker-compose: docker-auth awx/projects
-	CURRENT_UID=$(shell id -u) OS="$(shell docker info | grep 'Operating System')" TAG=$(COMPOSE_TAG) DEV_DOCKER_TAG_BASE=$(DEV_DOCKER_TAG_BASE) docker-compose -f tools/docker-compose.yml up --no-recreate awx
+	CURRENT_UID=$(shell id -u) OS="$(shell docker info | grep 'Operating System')" TAG=$(COMPOSE_TAG) DEV_DOCKER_TAG_BASE=$(DEV_DOCKER_TAG_BASE) docker-compose -f tools/docker-compose.yml $(COMPOSE_UP_OPTS) up --no-recreate awx
 
 docker-compose-cluster: docker-auth awx/projects
 	CURRENT_UID=$(shell id -u) TAG=$(COMPOSE_TAG) DEV_DOCKER_TAG_BASE=$(DEV_DOCKER_TAG_BASE) docker-compose -f tools/docker-compose-cluster.yml up
@@ -47,8 +47,6 @@ from awx.main.utils import (
     get_object_or_400,
     decrypt_field,
     get_awx_version,
-    get_licenser,
-    StubLicense
 )
 from awx.main.utils.db import get_all_field_names
 from awx.main.views import ApiErrorView
@@ -189,7 +187,8 @@ class APIView(views.APIView):
         '''
         Log warning for 400 requests. Add header with elapsed time.
         '''
+        from awx.main.utils import get_licenser
+        from awx.main.utils.licensing import OpenLicense
         #
         # If the URL was rewritten, and we get a 404, we should entirely
         # replace the view in the request context with an ApiErrorView()
@@ -225,7 +224,8 @@ class APIView(views.APIView):
         response = super(APIView, self).finalize_response(request, response, *args, **kwargs)
         time_started = getattr(self, 'time_started', None)
         response['X-API-Product-Version'] = get_awx_version()
-        response['X-API-Product-Name'] = 'AWX' if isinstance(get_licenser(), StubLicense) else 'Red Hat Ansible Tower'
+        response['X-API-Product-Name'] = 'AWX' if isinstance(get_licenser(), OpenLicense) else 'Red Hat Ansible Tower'
 
         response['X-API-Node'] = settings.CLUSTER_HOST_ID
         if time_started:
             time_elapsed = time.time() - self.time_started
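The `X-API-Product-Name` header above now keys off `OpenLicense` instead of the removed `StubLicense`. The branch reduces to a one-line dispatch; a self-contained sketch (the license classes are stand-ins, since `awx.main.utils.licensing` is not importable here):

```python
class OpenLicense:
    """Stand-in for awx.main.utils.licensing.OpenLicense (the AWX license)."""

class EnterpriseLicense:
    """Hypothetical stand-in for any non-open licenser."""

def product_name(licenser):
    # Mirrors the updated finalize_response(): report 'AWX' when running
    # under the open license, 'Red Hat Ansible Tower' otherwise.
    return 'AWX' if isinstance(licenser, OpenLicense) else 'Red Hat Ansible Tower'

print(product_name(OpenLicense()))        # AWX
print(product_name(EnterpriseLicense()))  # Red Hat Ansible Tower
```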
@@ -453,7 +453,7 @@ class BaseSerializer(serializers.ModelSerializer, metaclass=BaseSerializerMetacl
         if 'capability_map' not in self.context:
             if hasattr(self, 'polymorphic_base'):
                 model = self.polymorphic_base.Meta.model
-                prefetch_list = self.polymorphic_base._capabilities_prefetch
+                prefetch_list = self.polymorphic_base.capabilities_prefetch
             else:
                 model = self.Meta.model
                 prefetch_list = self.capabilities_prefetch
@@ -640,12 +640,9 @@ class EmptySerializer(serializers.Serializer):
 
 
 class UnifiedJobTemplateSerializer(BaseSerializer):
-    # As a base serializer, the capabilities prefetch is not used directly
-    _capabilities_prefetch = [
-        'admin', 'execute',
-        {'copy': ['jobtemplate.project.use', 'jobtemplate.inventory.use',
-                  'organization.workflow_admin']}
-    ]
+    # As a base serializer, the capabilities prefetch is not used directly,
+    # instead they are derived from the Workflow Job Template Serializer and the Job Template Serializer, respectively.
+    capabilities_prefetch = []
 
     class Meta:
         model = UnifiedJobTemplate
@@ -695,7 +692,7 @@ class UnifiedJobTemplateSerializer(BaseSerializer):
         serializer.polymorphic_base = self
         # capabilities prefetch is only valid for these models
         if isinstance(obj, (JobTemplate, WorkflowJobTemplate)):
-            serializer.capabilities_prefetch = self._capabilities_prefetch
+            serializer.capabilities_prefetch = serializer_class.capabilities_prefetch
         else:
             serializer.capabilities_prefetch = None
         return serializer.to_representation(obj)
@@ -1333,6 +1330,8 @@ class ProjectOptionsSerializer(BaseSerializer):
         scm_type = attrs.get('scm_type', u'') or u''
         if self.instance and not scm_type:
             valid_local_paths.append(self.instance.local_path)
+        if self.instance and scm_type and "local_path" in attrs and self.instance.local_path != attrs['local_path']:
+            errors['local_path'] = _(f'Cannot change local_path for {scm_type}-based projects')
         if scm_type:
             attrs.pop('local_path', None)
         if 'local_path' in attrs and attrs['local_path'] not in valid_local_paths:
@@ -8,7 +8,7 @@ The `period` of the data can be adjusted with:
 
     ?period=month
 
-Where `month` can be replaced with `week`, or `day`. `month` is the default.
+Where `month` can be replaced with `week`, `two_weeks`, or `day`. `month` is the default.
 
 The type of job can be filtered with:
 
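The docs change above adds `two_weeks` to the accepted `period` values; in the view, each period maps to a window end date computed with `dateutil.relativedelta`. A stdlib sketch of that mapping (using `datetime.timedelta`, which is equivalent for week arithmetic; the month branch is approximated as 30 days here, and the `day` interval value is an assumption):

```python
from datetime import datetime, timedelta

def period_window(period, start_date):
    """Return (end_date, interval) for a dashboard jobs-graph period."""
    if period == 'month':
        # The real view uses relativedelta(months=1); approximated as 30 days.
        return start_date - timedelta(days=30), 'days'
    elif period == 'two_weeks':
        return start_date - timedelta(weeks=2), 'days'
    elif period == 'week':
        return start_date - timedelta(weeks=1), 'days'
    elif period == 'day':
        return start_date - timedelta(days=1), 'hour'  # interval name assumed
    raise ValueError('unknown period: %s' % period)

start = datetime(2020, 10, 20)
print(period_window('two_weeks', start))  # (datetime.datetime(2020, 10, 6, 0, 0), 'days')
```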
@@ -15,6 +15,7 @@ from awx.api.views import (
     ApiV2PingView,
     ApiV2ConfigView,
     ApiV2SubscriptionView,
+    ApiV2AttachView,
     AuthView,
     UserMeList,
     DashboardView,
@@ -94,6 +95,7 @@ v2_urls = [
     url(r'^ping/$', ApiV2PingView.as_view(), name='api_v2_ping_view'),
     url(r'^config/$', ApiV2ConfigView.as_view(), name='api_v2_config_view'),
     url(r'^config/subscriptions/$', ApiV2SubscriptionView.as_view(), name='api_v2_subscription_view'),
+    url(r'^config/attach/$', ApiV2AttachView.as_view(), name='api_v2_attach_view'),
     url(r'^auth/$', AuthView.as_view()),
     url(r'^me/$', UserMeList.as_view(), name='user_me_list'),
     url(r'^dashboard/$', DashboardView.as_view(), name='dashboard_view'),
@@ -153,6 +153,7 @@ from awx.api.views.root import ( # noqa
     ApiV2PingView,
     ApiV2ConfigView,
     ApiV2SubscriptionView,
+    ApiV2AttachView,
 )
 from awx.api.views.webhooks import ( # noqa
     WebhookKeyView,
@@ -316,6 +317,9 @@ class DashboardJobsGraphView(APIView):
         if period == 'month':
             end_date = start_date - dateutil.relativedelta.relativedelta(months=1)
             interval = 'days'
+        elif period == 'two_weeks':
+            end_date = start_date - dateutil.relativedelta.relativedelta(weeks=2)
+            interval = 'days'
         elif period == 'week':
             end_date = start_date - dateutil.relativedelta.relativedelta(weeks=1)
             interval = 'days'
@@ -3043,7 +3047,7 @@ class WorkflowJobTemplateNodeCreateApproval(RetrieveAPIView):
             approval_template,
             context=self.get_serializer_context()
         ).data
-        return Response(data, status=status.HTTP_200_OK)
+        return Response(data, status=status.HTTP_201_CREATED)
 
     def check_permissions(self, request):
         obj = self.get_object().workflow_job_template
@@ -4253,7 +4257,9 @@ class NotificationTemplateDetail(RetrieveUpdateDestroyAPIView):
         obj = self.get_object()
         if not request.user.can_access(self.model, 'delete', obj):
             return Response(status=status.HTTP_404_NOT_FOUND)
-        if obj.notifications.filter(status='pending').exists():
+        hours_old = now() - dateutil.relativedelta.relativedelta(hours=8)
+        if obj.notifications.filter(status='pending', created__gt=hours_old).exists():
             return Response({"error": _("Delete not allowed while there are pending notifications")},
                             status=status.HTTP_405_METHOD_NOT_ALLOWED)
         return super(NotificationTemplateDetail, self).delete(request, *args, **kwargs)
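The `NotificationTemplateDetail.delete` change above relaxes the guard: instead of blocking deletion whenever any pending notification exists, it now blocks only when a pending notification was created within the last eight hours. A sketch of that predicate with plain datetimes standing in for Django's `now()` and the ORM filter:

```python
from datetime import datetime, timedelta

def delete_blocked(pending_created_times, now):
    """True if any pending notification is younger than 8 hours.

    Mirrors: obj.notifications.filter(status='pending', created__gt=hours_old).exists()
    where hours_old = now() - relativedelta(hours=8).
    """
    hours_old = now - timedelta(hours=8)
    return any(created > hours_old for created in pending_created_times)

now = datetime(2020, 10, 20, 12, 0)
print(delete_blocked([datetime(2020, 10, 20, 9, 0)], now))   # True: 3 hours old
print(delete_blocked([datetime(2020, 10, 19, 12, 0)], now))  # False: 24 hours old
```

The practical effect is that a notification stuck in `pending` forever (for example, from a dead dispatcher) no longer makes its template permanently undeletable.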
@@ -1,9 +1,10 @@
 # Copyright (c) 2018 Ansible, Inc.
 # All Rights Reserved.
 
+import base64
+import json
 import logging
 import operator
-import json
 from collections import OrderedDict
 
 from django.conf import settings
@@ -29,8 +30,8 @@ from awx.main.utils import (
     get_custom_venv_choices,
     to_python_boolean,
 )
+from awx.main.utils.licensing import validate_entitlement_manifest
 from awx.api.versioning import reverse, drf_reverse
-from awx.conf.license import get_license
 from awx.main.constants import PRIVILEGE_ESCALATION_METHODS
 from awx.main.models import (
     Project,
@@ -178,7 +179,7 @@ class ApiV2PingView(APIView):
 class ApiV2SubscriptionView(APIView):
 
     permission_classes = (IsAuthenticated,)
-    name = _('Configuration')
+    name = _('Subscriptions')
     swagger_topic = 'System Configuration'
 
     def check_permissions(self, request):
@@ -189,18 +190,18 @@ class ApiV2SubscriptionView(APIView):
     def post(self, request):
         from awx.main.utils.common import get_licenser
         data = request.data.copy()
-        if data.get('rh_password') == '$encrypted$':
-            data['rh_password'] = settings.REDHAT_PASSWORD
+        if data.get('subscriptions_password') == '$encrypted$':
+            data['subscriptions_password'] = settings.SUBSCRIPTIONS_PASSWORD
         try:
-            user, pw = data.get('rh_username'), data.get('rh_password')
+            user, pw = data.get('subscriptions_username'), data.get('subscriptions_password')
             with set_environ(**settings.AWX_TASK_ENV):
                 validated = get_licenser().validate_rh(user, pw)
             if user:
-                settings.REDHAT_USERNAME = data['rh_username']
+                settings.SUBSCRIPTIONS_USERNAME = data['subscriptions_username']
             if pw:
-                settings.REDHAT_PASSWORD = data['rh_password']
+                settings.SUBSCRIPTIONS_PASSWORD = data['subscriptions_password']
         except Exception as exc:
-            msg = _("Invalid License")
+            msg = _("Invalid Subscription")
             if (
                 isinstance(exc, requests.exceptions.HTTPError) and
                 getattr(getattr(exc, 'response', None), 'status_code', None) == 401
@@ -213,13 +214,63 @@ class ApiV2SubscriptionView(APIView):
|
|||||||
elif isinstance(exc, (ValueError, OSError)) and exc.args:
|
elif isinstance(exc, (ValueError, OSError)) and exc.args:
|
||||||
msg = exc.args[0]
|
msg = exc.args[0]
|
||||||
else:
|
else:
|
||||||
-                logger.exception(smart_text(u"Invalid license submitted."),
+                logger.exception(smart_text(u"Invalid subscription submitted."),
                                  extra=dict(actor=request.user.username))
             return Response({"error": msg}, status=status.HTTP_400_BAD_REQUEST)

         return Response(validated)


+class ApiV2AttachView(APIView):
+
+    permission_classes = (IsAuthenticated,)
+    name = _('Attach Subscription')
+    swagger_topic = 'System Configuration'
+
+    def check_permissions(self, request):
+        super(ApiV2AttachView, self).check_permissions(request)
+        if not request.user.is_superuser and request.method.lower() not in {'options', 'head'}:
+            self.permission_denied(request)  # Raises PermissionDenied exception.
+
+    def post(self, request):
+        data = request.data.copy()
+        pool_id = data.get('pool_id', None)
+        if not pool_id:
+            return Response({"error": _("No subscription pool ID provided.")}, status=status.HTTP_400_BAD_REQUEST)
+        user = getattr(settings, 'SUBSCRIPTIONS_USERNAME', None)
+        pw = getattr(settings, 'SUBSCRIPTIONS_PASSWORD', None)
+        if pool_id and user and pw:
+            from awx.main.utils.common import get_licenser
+            data = request.data.copy()
+            try:
+                with set_environ(**settings.AWX_TASK_ENV):
+                    validated = get_licenser().validate_rh(user, pw)
+            except Exception as exc:
+                msg = _("Invalid Subscription")
+                if (
+                    isinstance(exc, requests.exceptions.HTTPError) and
+                    getattr(getattr(exc, 'response', None), 'status_code', None) == 401
+                ):
+                    msg = _("The provided credentials are invalid (HTTP 401).")
+                elif isinstance(exc, requests.exceptions.ProxyError):
+                    msg = _("Unable to connect to proxy server.")
+                elif isinstance(exc, requests.exceptions.ConnectionError):
+                    msg = _("Could not connect to subscription service.")
+                elif isinstance(exc, (ValueError, OSError)) and exc.args:
+                    msg = exc.args[0]
+                else:
+                    logger.exception(smart_text(u"Invalid subscription submitted."),
+                                     extra=dict(actor=request.user.username))
+                return Response({"error": msg}, status=status.HTTP_400_BAD_REQUEST)
+        for sub in validated:
+            if sub['pool_id'] == pool_id:
+                sub['valid_key'] = True
+                settings.LICENSE = sub
+                return Response(sub)
+
+        return Response({"error": _("Error processing subscription metadata.")}, status=status.HTTP_400_BAD_REQUEST)
+
+
 class ApiV2ConfigView(APIView):

     permission_classes = (IsAuthenticated,)
@@ -234,15 +285,11 @@ class ApiV2ConfigView(APIView):
     def get(self, request, format=None):
         '''Return various sitewide configuration settings'''

-        if request.user.is_superuser or request.user.is_system_auditor:
-            license_data = get_license(show_key=True)
-        else:
-            license_data = get_license(show_key=False)
+        from awx.main.utils.common import get_licenser
+        license_data = get_licenser().validate()
         if not license_data.get('valid_key', False):
             license_data = {}
-        if license_data and 'features' in license_data and 'activity_streams' in license_data['features']:
-            # FIXME: Make the final setting value dependent on the feature?
-            license_data['features']['activity_streams'] &= settings.ACTIVITY_STREAM_ENABLED

         pendo_state = settings.PENDO_TRACKING_STATE if settings.PENDO_TRACKING_STATE in ('off', 'anonymous', 'detailed') else 'off'

@@ -281,9 +328,10 @@ class ApiV2ConfigView(APIView):

         return Response(data)

     def post(self, request):
         if not isinstance(request.data, dict):
-            return Response({"error": _("Invalid license data")}, status=status.HTTP_400_BAD_REQUEST)
+            return Response({"error": _("Invalid subscription data")}, status=status.HTTP_400_BAD_REQUEST)
         if "eula_accepted" not in request.data:
             return Response({"error": _("Missing 'eula_accepted' property")}, status=status.HTTP_400_BAD_REQUEST)
         try:

@@ -300,25 +348,47 @@ class ApiV2ConfigView(APIView):
             logger.info(smart_text(u"Invalid JSON submitted for license."),
                         extra=dict(actor=request.user.username))
             return Response({"error": _("Invalid JSON")}, status=status.HTTP_400_BAD_REQUEST)
-        try:
-            from awx.main.utils.common import get_licenser
-            license_data = json.loads(data_actual)
-            license_data_validated = get_licenser(**license_data).validate()
-        except Exception:
-            logger.warning(smart_text(u"Invalid license submitted."),
-                           extra=dict(actor=request.user.username))
-            return Response({"error": _("Invalid License")}, status=status.HTTP_400_BAD_REQUEST)
+        from awx.main.utils.common import get_licenser
+        license_data = json.loads(data_actual)
+        if 'license_key' in license_data:
+            return Response({"error": _('Legacy license submitted. A subscription manifest is now required.')}, status=status.HTTP_400_BAD_REQUEST)
+        if 'manifest' in license_data:
+            try:
+                json_actual = json.loads(base64.b64decode(license_data['manifest']))
+                if 'license_key' in json_actual:
+                    return Response(
+                        {"error": _('Legacy license submitted. A subscription manifest is now required.')},
+                        status=status.HTTP_400_BAD_REQUEST
+                    )
+            except Exception:
+                pass
+            try:
+                license_data = validate_entitlement_manifest(license_data['manifest'])
+            except ValueError as e:
+                return Response({"error": str(e)}, status=status.HTTP_400_BAD_REQUEST)
+            except Exception:
+                logger.exception('Invalid manifest submitted. {}')
+                return Response({"error": _('Invalid manifest submitted.')}, status=status.HTTP_400_BAD_REQUEST)
+
+            try:
+                license_data_validated = get_licenser().license_from_manifest(license_data)
+            except Exception:
+                logger.warning(smart_text(u"Invalid subscription submitted."),
+                               extra=dict(actor=request.user.username))
+                return Response({"error": _("Invalid License")}, status=status.HTTP_400_BAD_REQUEST)
+        else:
+            license_data_validated = get_licenser().validate()

         # If the license is valid, write it to the database.
         if license_data_validated['valid_key']:
-            settings.LICENSE = license_data
             if not settings_registry.is_setting_read_only('TOWER_URL_BASE'):
                 settings.TOWER_URL_BASE = "{}://{}".format(request.scheme, request.get_host())
             return Response(license_data_validated)

-        logger.warning(smart_text(u"Invalid license submitted."),
+        logger.warning(smart_text(u"Invalid subscription submitted."),
                        extra=dict(actor=request.user.username))
-        return Response({"error": _("Invalid license")}, status=status.HTTP_400_BAD_REQUEST)
+        return Response({"error": _("Invalid subscription")}, status=status.HTTP_400_BAD_REQUEST)

     def delete(self, request):
         try:
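For reviewers: the legacy-license guard added above can be exercised in isolation. This is a minimal sketch (the function name and return convention are illustrative, not part of the PR) of the same check — a base64-encoded JSON payload carrying a `license_key` is rejected before any entitlement-manifest validation runs.

```python
import base64
import json


def reject_legacy_license(posted):
    """Mirror the view's guard: detect an old-style license hiding in a manifest."""
    message = 'Legacy license submitted. A subscription manifest is now required.'
    if 'license_key' in posted:
        return message
    if 'manifest' in posted:
        try:
            decoded = json.loads(base64.b64decode(posted['manifest']))
            if 'license_key' in decoded:
                return message
        except Exception:
            # Not base64-encoded JSON: treat it as a real entitlement
            # manifest and fall through to manifest validation.
            pass
    return None


legacy = base64.b64encode(json.dumps({'license_key': 'abc123'}).encode()).decode()
print(reject_legacy_license({'manifest': legacy}))  # legacy payload is rejected
print(reject_legacy_license({'manifest': 'zzzz'}))  # opaque manifest falls through
```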
@@ -25,10 +25,12 @@ if MODE == 'production':
     try:
         fd = open("/var/lib/awx/.tower_version", "r")
         if fd.read().strip() != tower_version:
-            raise Exception()
-    except Exception:
+            raise ValueError()
+    except FileNotFoundError:
+        pass
+    except ValueError as e:
         logger.error("Missing or incorrect metadata for Tower version. Ensure Tower was installed using the setup playbook.")
-        raise Exception("Missing or incorrect metadata for Tower version. Ensure Tower was installed using the setup playbook.")
+        raise Exception("Missing or incorrect metadata for Tower version. Ensure Tower was installed using the setup playbook.") from e


 os.environ.setdefault("DJANGO_SETTINGS_MODULE", "awx.settings")
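The hunk above splits a blanket `except Exception` into distinct branches and chains the re-raise with `from e` so the original traceback survives. A small standalone sketch of the same pattern (the function and message are illustrative; version text is passed in rather than read from disk):

```python
def check_version(found, expected):
    """Tolerate missing metadata; chain the error when metadata is wrong."""
    if found is None:
        # Equivalent of the FileNotFoundError branch: a missing file is tolerated.
        return
    try:
        if found.strip() != expected:
            raise ValueError(found)
    except ValueError as e:
        # 'from e' keeps the original ValueError attached as __cause__.
        raise Exception("Missing or incorrect metadata for Tower version.") from e


check_version(None, '15.0.1')       # tolerated
check_version('15.0.1\n', '15.0.1')  # matches
```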
@@ -1,18 +1,14 @@
 # Copyright (c) 2016 Ansible, Inc.
 # All Rights Reserved.


 __all__ = ['get_license']


 def _get_validated_license_data():
-    from awx.main.utils.common import get_licenser
+    from awx.main.utils import get_licenser
     return get_licenser().validate()


-def get_license(show_key=False):
+def get_license():
     """Return a dictionary representing the active license on this Tower instance."""
-    license_data = _get_validated_license_data()
-    if not show_key:
-        license_data.pop('license_key', None)
-    return license_data
+    return _get_validated_license_data()
awx/conf/migrations/0008_subscriptions.py (new file)
@@ -0,0 +1,26 @@
+# Generated by Django 2.2.11 on 2020-08-04 15:19
+
+import logging
+
+from django.db import migrations
+
+from awx.conf.migrations._subscriptions import clear_old_license, prefill_rh_credentials
+
+logger = logging.getLogger('awx.conf.migrations')
+
+
+def _noop(apps, schema_editor):
+    pass
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('conf', '0007_v380_rename_more_settings'),
+    ]
+
+    operations = [
+        migrations.RunPython(clear_old_license, _noop),
+        migrations.RunPython(prefill_rh_credentials, _noop)
+    ]
awx/conf/migrations/_subscriptions.py (new file)
@@ -0,0 +1,34 @@
+# -*- coding: utf-8 -*-
+import logging
+
+from django.utils.timezone import now
+
+from awx.main.utils.encryption import decrypt_field, encrypt_field
+
+logger = logging.getLogger('awx.conf.settings')
+
+__all__ = ['clear_old_license', 'prefill_rh_credentials']
+
+
+def clear_old_license(apps, schema_editor):
+    Setting = apps.get_model('conf', 'Setting')
+    Setting.objects.filter(key='LICENSE').delete()
+
+
+def _migrate_setting(apps, old_key, new_key, encrypted=False):
+    Setting = apps.get_model('conf', 'Setting')
+    if not Setting.objects.filter(key=old_key).exists():
+        return
+    new_setting = Setting.objects.create(key=new_key,
+                                         created=now(),
+                                         modified=now()
+                                         )
+    if encrypted:
+        new_setting.value = decrypt_field(Setting.objects.filter(key=old_key).first(), 'value')
+        new_setting.value = encrypt_field(new_setting, 'value')
+    else:
+        new_setting.value = getattr(Setting.objects.filter(key=old_key).first(), 'value')
+    new_setting.save()
+
+
+def prefill_rh_credentials(apps, schema_editor):
+    _migrate_setting(apps, 'REDHAT_USERNAME', 'SUBSCRIPTIONS_USERNAME', encrypted=False)
+    _migrate_setting(apps, 'REDHAT_PASSWORD', 'SUBSCRIPTIONS_PASSWORD', encrypted=True)
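The copy-then-re-encrypt flow in `_migrate_setting` above can be sketched against a plain dict — here the `Setting` model and the field-level crypto are stood in by a dict and a pair of pluggable callables (all names illustrative):

```python
def migrate_setting(store, old_key, new_key, decrypt=lambda v: v, encrypt=lambda v: v):
    """Copy a setting to its new name, round-tripping encrypted values."""
    # Skip silently when the old setting was never configured.
    if old_key not in store:
        return
    # Decrypt under the old key's context, re-encrypt under the new one;
    # the old setting is left in place, as in the migration above.
    store[new_key] = encrypt(decrypt(store[old_key]))


settings_store = {'REDHAT_USERNAME': 'alice', 'REDHAT_PASSWORD': '<cipher>s3cret'}
migrate_setting(settings_store, 'REDHAT_USERNAME', 'SUBSCRIPTIONS_USERNAME')
migrate_setting(settings_store, 'REDHAT_PASSWORD', 'SUBSCRIPTIONS_PASSWORD',
                decrypt=lambda v: v[len('<cipher>'):],
                encrypt=lambda v: '<cipher>' + v)
print(settings_store['SUBSCRIPTIONS_USERNAME'])
```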
@@ -78,14 +78,6 @@ class Setting(CreatedModifiedModel):
     def get_cache_id_key(self, key):
         return '{}_ID'.format(key)

-    def display_value(self):
-        if self.key == 'LICENSE' and 'license_key' in self.value:
-            # don't log the license key in activity stream
-            value = self.value.copy()
-            value['license_key'] = '********'
-            return value
-        return self.value
-

 import awx.conf.signals  # noqa
@@ -33,9 +33,9 @@ data _since_ the last report date - i.e., new data in the last 24 hours)
 '''


-@register('config', '1.1', description=_('General platform configuration.'))
+@register('config', '1.2', description=_('General platform configuration.'))
 def config(since, **kwargs):
-    license_info = get_license(show_key=False)
+    license_info = get_license()
     install_type = 'traditional'
     if os.environ.get('container') == 'oci':
         install_type = 'openshift'

@@ -194,7 +194,6 @@ def instance_info(since, include_hostnames=False, **kwargs):
     return info


-@register('job_counts', '1.0', description=_('Counts of jobs by status'))
 def job_counts(since, **kwargs):
     counts = {}
     counts['total_jobs'] = models.UnifiedJob.objects.exclude(launch_type='sync').count()

@@ -204,7 +203,6 @@ def job_counts(since, **kwargs):
     return counts


-@register('job_instance_counts', '1.0', description=_('Counts of jobs by execution node'))
 def job_instance_counts(since, **kwargs):
     counts = {}
     job_types = models.UnifiedJob.objects.exclude(launch_type='sync').values_list(
@@ -24,7 +24,7 @@ logger = logging.getLogger('awx.main.analytics')

 def _valid_license():
     try:
-        if get_license(show_key=False).get('license_type', 'UNLICENSED') == 'open':
+        if get_license().get('license_type', 'UNLICENSED') == 'open':
             return False
         access_registry[Job](None).check_license()
     except PermissionDenied:

@@ -12,7 +12,7 @@ from prometheus_client import (
 from awx.conf.license import get_license
 from awx.main.utils import (get_awx_version, get_ansible_version)
 from awx.main.analytics.collectors import (
     counts,
     instance_info,
     job_instance_counts,
     job_counts,

@@ -54,7 +54,7 @@ LICENSE_INSTANCE_FREE = Gauge('awx_license_instance_free', 'Number of remaining


 def metrics():
-    license_info = get_license(show_key=False)
+    license_info = get_license()
     SYSTEM_INFO.info({
         'install_uuid': settings.INSTALL_UUID,
         'insights_analytics': str(settings.INSIGHTS_TRACKING_STATE),
@@ -1,7 +1,5 @@
 # Python
-import json
 import logging
-import os

 # Django
 from django.utils.translation import ugettext_lazy as _

@@ -13,6 +11,7 @@ from rest_framework.fields import FloatField
 # Tower
 from awx.conf import fields, register, register_validate


 logger = logging.getLogger('awx.main.conf')


 register(

@@ -92,22 +91,10 @@ register(
 )


-def _load_default_license_from_file():
-    try:
-        license_file = os.environ.get('AWX_LICENSE_FILE', '/etc/tower/license')
-        if os.path.exists(license_file):
-            license_data = json.load(open(license_file))
-            logger.debug('Read license data from "%s".', license_file)
-            return license_data
-    except Exception:
-        logger.warning('Could not read license from "%s".', license_file, exc_info=True)
-    return {}
-
-
 register(
     'LICENSE',
     field_class=fields.DictField,
-    default=_load_default_license_from_file,
+    default=lambda: {},
     label=_('License'),
     help_text=_('The license controls which features and functionality are '
                 'enabled. Use /api/v2/config/ to update or change '

@@ -124,7 +111,7 @@ register(
     encrypted=False,
     read_only=False,
     label=_('Red Hat customer username'),
-    help_text=_('This username is used to retrieve license information and to send Automation Analytics'),  # noqa
+    help_text=_('This username is used to send data to Automation Analytics'),
     category=_('System'),
     category_slug='system',
 )

@@ -137,7 +124,33 @@ register(
     encrypted=True,
     read_only=False,
     label=_('Red Hat customer password'),
-    help_text=_('This password is used to retrieve license information and to send Automation Analytics'),  # noqa
+    help_text=_('This password is used to send data to Automation Analytics'),
+    category=_('System'),
+    category_slug='system',
+)
+
+register(
+    'SUBSCRIPTIONS_USERNAME',
+    field_class=fields.CharField,
+    default='',
+    allow_blank=True,
+    encrypted=False,
+    read_only=False,
+    label=_('Red Hat or Satellite username'),
+    help_text=_('This username is used to retrieve subscription and content information'),  # noqa
+    category=_('System'),
+    category_slug='system',
+)
+
+register(
+    'SUBSCRIPTIONS_PASSWORD',
+    field_class=fields.CharField,
+    default='',
+    allow_blank=True,
+    encrypted=True,
+    read_only=False,
+    label=_('Red Hat or Satellite password'),
+    help_text=_('This password is used to retrieve subscription and content information'),  # noqa
     category=_('System'),
     category_slug='system',
 )
@@ -1,10 +1,7 @@
-import cProfile
 import json
 import logging
 import os
-import pstats
 import signal
-import tempfile
 import time
 import traceback

@@ -23,6 +20,7 @@ from awx.main.models import (JobEvent, AdHocCommandEvent, ProjectUpdateEvent,
                              Job)
 from awx.main.tasks import handle_success_and_failure_notifications
 from awx.main.models.events import emit_event_detail
+from awx.main.utils.profiling import AWXProfiler

 from .base import BaseWorker

@@ -48,6 +46,7 @@ class CallbackBrokerWorker(BaseWorker):
         self.buff = {}
         self.pid = os.getpid()
         self.redis = redis.Redis.from_url(settings.BROKER_URL)
+        self.prof = AWXProfiler("CallbackBrokerWorker")
         for key in self.redis.keys('awx_callback_receiver_statistics_*'):
             self.redis.delete(key)

@@ -87,19 +86,12 @@ class CallbackBrokerWorker(BaseWorker):
         )

     def toggle_profiling(self, *args):
-        if self.prof:
-            self.prof.disable()
-            filename = f'callback-{self.pid}.pstats'
-            filepath = os.path.join(tempfile.gettempdir(), filename)
-            with open(filepath, 'w') as f:
-                pstats.Stats(self.prof, stream=f).sort_stats('cumulative').print_stats()
-            pstats.Stats(self.prof).dump_stats(filepath + '.raw')
-            self.prof = False
-            logger.error(f'profiling is disabled, wrote {filepath}')
-        else:
-            self.prof = cProfile.Profile()
-            self.prof.enable()
+        if not self.prof.is_started():
+            self.prof.start()
             logger.error('profiling is enabled')
+        else:
+            filepath = self.prof.stop()
+            logger.error(f'profiling is disabled, wrote {filepath}')

     def work_loop(self, *args, **kw):
         if settings.AWX_CALLBACK_PROFILE:
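The inline cProfile bookkeeping above is replaced by the `AWXProfiler` API (`is_started()` / `start()` / `stop()`). A plausible sketch of such a wrapper, assuming `stop()` dumps a `.pstats` file and returns its path (this is not the actual `awx.main.utils.profiling` implementation, just the shape implied by the diff):

```python
import cProfile
import os
import pstats
import tempfile


class SimpleProfiler:
    """Start/stop wrapper around cProfile; stop() writes stats and returns the path."""

    def __init__(self, name):
        self.name = name
        self.prof = None

    def is_started(self):
        return self.prof is not None

    def start(self):
        self.prof = cProfile.Profile()
        self.prof.enable()

    def stop(self):
        self.prof.disable()
        filepath = os.path.join(tempfile.gettempdir(), f'{self.name}-{os.getpid()}.pstats')
        pstats.Stats(self.prof).dump_stats(filepath)
        self.prof = None
        return filepath


p = SimpleProfiler("CallbackBrokerWorker")
p.start()
sum(i * i for i in range(10000))  # some work to profile
print(p.stop())
```

Centralizing the dump logic in one object keeps the worker's toggle handler down to a start/stop branch, as in the hunk above.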
@@ -18,7 +18,5 @@ class Command(BaseCommand):
         super(Command, self).__init__()
         license = get_licenser().validate()
         if options.get('data'):
-            if license.get('license_key', '') != 'UNLICENSED':
-                license['license_key'] = '********'
             return json.dumps(license)
         return license.get('license_type', 'none')
@@ -8,5 +8,7 @@ class Command(MakeMigrations):
     def execute(self, *args, **options):
         settings = connections['default'].settings_dict.copy()
         settings['ENGINE'] = 'sqlite3'
+        if 'application_name' in settings['OPTIONS']:
+            del settings['OPTIONS']['application_name']
         connections['default'] = DatabaseWrapper(settings)
         return MakeMigrations().execute(*args, **options)
awx/main/management/commands/graph_jobs.py (new file)
@@ -0,0 +1,117 @@
+# Python
+import asciichartpy as chart
+import collections
+import time
+import sys
+
+# Django
+from django.db.models import Count
+from django.core.management.base import BaseCommand
+
+# AWX
+from awx.main.models import (
+    Job,
+    Instance
+)
+
+
+DEFAULT_WIDTH = 100
+DEFAULT_HEIGHT = 30
+
+
+def chart_color_lookup(color_str):
+    return getattr(chart, color_str)
+
+
+def clear_screen():
+    print(chr(27) + "[2J")
+
+
+class JobStatus():
+    def __init__(self, status, color, width):
+        self.status = status
+        self.color = color
+        self.color_code = chart_color_lookup(color)
+        self.x = collections.deque(maxlen=width)
+        self.y = collections.deque(maxlen=width)
+
+    def tick(self, x, y):
+        self.x.append(x)
+        self.y.append(y)
+
+
+class JobStatusController:
+    RESET = chart_color_lookup('reset')
+
+    def __init__(self, width):
+        self.plots = [
+            JobStatus('pending', 'red', width),
+            JobStatus('waiting', 'blue', width),
+            JobStatus('running', 'green', width)
+        ]
+        self.ts_start = int(time.time())
+
+    def tick(self):
+        ts = int(time.time()) - self.ts_start
+        q = Job.objects.filter(status__in=['pending','waiting','running']).values_list('status').order_by().annotate(Count('status'))
+        status_count = dict(pending=0, waiting=0, running=0)
+        for status, count in q:
+            status_count[status] = count
+
+        for p in self.plots:
+            p.tick(ts, status_count[p.status])
+
+    def series(self):
+        return [list(p.y) for p in self.plots]
+
+    def generate_status(self):
+        line = ""
+        lines = []
+        for p in self.plots:
+            lines.append(f'{p.color_code}{p.status} {p.y[-1]}{self.RESET}')
+
+        line += ", ".join(lines) + '\n'
+
+        width = 5
+        time_running = int(time.time()) - self.ts_start
+        instances = Instance.objects.all().order_by('hostname')
+        line += "Capacity: " + ", ".join([f"{instance.capacity:{width}}" for instance in instances]) + '\n'
+        line += "Remaining: " + ", ".join([f"{instance.remaining_capacity:{width}}" for instance in instances]) + '\n'
+        line += f"Seconds running: {time_running}" + '\n'
+
+        return line
+
+
+class Command(BaseCommand):
+    help = "Plot pending, waiting, running jobs over time on the terminal"
+
+    def add_arguments(self, parser):
+        parser.add_argument('--refresh', dest='refresh', type=float, default=1.0,
+                            help='Time between refreshes of the graph and data in seconds (defaults to 1.0)')
+        parser.add_argument('--width', dest='width', type=int, default=DEFAULT_WIDTH,
+                            help=f'Width of the graph (defaults to {DEFAULT_WIDTH})')
+        parser.add_argument('--height', dest='height', type=int, default=DEFAULT_HEIGHT,
+                            help=f'Height of the graph (defaults to {DEFAULT_HEIGHT})')
+
+    def handle(self, *args, **options):
+        refresh_seconds = options['refresh']
+        width = options['width']
+        height = options['height']
+
+        jctl = JobStatusController(width)
+
+        conf = {
+            'colors': [chart_color_lookup(p.color) for p in jctl.plots],
+            'height': height,
+        }
+
+        while True:
+            jctl.tick()
+
+            draw = chart.plot(jctl.series(), conf)
+            status_line = jctl.generate_status()
+            clear_screen()
+            print(draw)
+            sys.stdout.write(status_line)
+            time.sleep(refresh_seconds)
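The new command keeps a fixed-width rolling window of samples per job status via `collections.deque(maxlen=width)`, so old samples fall off automatically as new ticks arrive. That sampling logic works standalone, without Django or asciichartpy:

```python
import collections


class SeriesPlot:
    """Fixed-width rolling series: the oldest sample is evicted once maxlen is hit."""

    def __init__(self, status, width):
        self.status = status
        self.x = collections.deque(maxlen=width)
        self.y = collections.deque(maxlen=width)

    def tick(self, x, y):
        # deque(maxlen=N) silently drops the leftmost element on overflow,
        # so no manual trimming is needed per tick.
        self.x.append(x)
        self.y.append(y)


plot = SeriesPlot('pending', width=3)
for ts, count in enumerate([5, 7, 2, 9]):
    plot.tick(ts, count)

print(list(plot.y))  # → [7, 2, 9]: the first sample (5) has been evicted
```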
@@ -903,7 +903,7 @@ class Command(BaseCommand):
     def check_license(self):
         license_info = get_licenser().validate()
         local_license_type = license_info.get('license_type', 'UNLICENSED')
-        if license_info.get('license_key', 'UNLICENSED') == 'UNLICENSED':
+        if local_license_type == 'UNLICENSED':
             logger.error(LICENSE_NON_EXISTANT_MESSAGE)
             raise CommandError('No license found!')
         elif local_license_type == 'open':
@@ -19,7 +19,9 @@ class Command(BaseCommand):
         profile_sql.delay(
             threshold=options['threshold'], minutes=options['minutes']
         )
-        print(f"Logging initiated with a threshold of {options['threshold']} second(s) and a duration of"
-              f" {options['minutes']} minute(s), any queries that meet criteria can"
-              f" be found in /var/log/tower/profile/."
-              )
+        if options['threshold'] > 0:
+            print(f"SQL profiling initiated with a threshold of {options['threshold']} second(s) and a"
+                  f" duration of {options['minutes']} minute(s), any queries that meet criteria can"
+                  f" be found in /var/log/tower/profile/.")
+        else:
+            print("SQL profiling disabled.")
@@ -1,13 +1,9 @@
 # Copyright (c) 2015 Ansible, Inc.
 # All Rights Reserved.

-import uuid
 import logging
 import threading
 import time
-import cProfile
-import pstats
-import os
 import urllib.parse

 from django.conf import settings

@@ -22,6 +18,7 @@ from django.urls import reverse, resolve

 from awx.main.utils.named_url_graph import generate_graph, GraphNode
 from awx.conf import fields, register
+from awx.main.utils.profiling import AWXProfiler


 logger = logging.getLogger('awx.main.middleware')

@@ -32,11 +29,14 @@ class TimingMiddleware(threading.local, MiddlewareMixin):

     dest = '/var/log/tower/profile'

+    def __init__(self, *args, **kwargs):
+        super().__init__(*args, **kwargs)
+        self.prof = AWXProfiler("TimingMiddleware")
+
     def process_request(self, request):
         self.start_time = time.time()
         if settings.AWX_REQUEST_PROFILE:
-            self.prof = cProfile.Profile()
-            self.prof.enable()
+            self.prof.start()

     def process_response(self, request, response):
         if not hasattr(self, 'start_time'):  # some tools may not invoke process_request

@@ -44,33 +44,10 @@ class TimingMiddleware(threading.local, MiddlewareMixin):
         total_time = time.time() - self.start_time
         response['X-API-Total-Time'] = '%0.3fs' % total_time
         if settings.AWX_REQUEST_PROFILE:
-            self.prof.disable()
-            cprofile_file = self.save_profile_file(request)
-            response['cprofile_file'] = cprofile_file
+            response['X-API-Profile-File'] = self.prof.stop()
         perf_logger.info('api response times', extra=dict(python_objects=dict(request=request, response=response)))
         return response

-    def save_profile_file(self, request):
-        if not os.path.isdir(self.dest):
-            os.makedirs(self.dest)
-        filename = '%.3fs-%s.pstats' % (pstats.Stats(self.prof).total_tt, uuid.uuid4())
-        filepath = os.path.join(self.dest, filename)
-        with open(filepath, 'w') as f:
-            f.write('%s %s\n' % (request.method, request.get_full_path()))
-            pstats.Stats(self.prof, stream=f).sort_stats('cumulative').print_stats()
-
-        if settings.AWX_REQUEST_PROFILE_WITH_DOT:
-            from gprof2dot import main as generate_dot
-            raw = os.path.join(self.dest, filename) + '.raw'
-            pstats.Stats(self.prof).dump_stats(raw)
-            generate_dot([
-                '-n', '2.5', '-f', 'pstats', '-o',
-                os.path.join(self.dest, filename).replace('.pstats', '.dot'),
-                raw
-            ])
-            os.remove(raw)
-        return filepath
-

 class SessionTimeoutMiddleware(MiddlewareMixin):
     """
|
|||||||
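The removed `save_profile_file` helper followed the standard-library pattern of enabling a `cProfile.Profile` around a request and rendering the results with `pstats`. A minimal, self-contained sketch of that pattern (the `busy_work` function is an illustrative stand-in for a request handler, not AWX code):

```python
import cProfile
import io
import pstats

def busy_work(n):
    # Arbitrary stand-in for the code under profile (a request handler in the middleware).
    return sum(i * i for i in range(n))

prof = cProfile.Profile()
prof.enable()            # corresponds to process_request()
result = busy_work(10000)
prof.disable()           # corresponds to process_response()

# Render the stats sorted by cumulative time, as save_profile_file did to a file.
buf = io.StringIO()
pstats.Stats(prof, stream=buf).sort_stats('cumulative').print_stats()
report = buf.getvalue()
```

The new code routes the same idea through an `AWXProfiler` object instead, so enable/disable and file handling live in one place.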
@@ -1,11 +1,7 @@
 # Generated by Django 2.2.11 on 2020-05-01 13:25
 
 from django.db import migrations, models
-from awx.main.migrations._inventory_source import create_scm_script_substitute
-
-
-def convert_cloudforms_to_scm(apps, schema_editor):
-    create_scm_script_substitute(apps, 'cloudforms')
+from awx.main.migrations._inventory_source import delete_cloudforms_inv_source
 
 
 class Migration(migrations.Migration):
@@ -15,7 +11,7 @@ class Migration(migrations.Migration):
     ]
 
     operations = [
-        migrations.RunPython(convert_cloudforms_to_scm),
+        migrations.RunPython(delete_cloudforms_inv_source),
         migrations.AlterField(
             model_name='inventorysource',
             name='source',
@@ -0,0 +1,13 @@
+from django.db import migrations
+from awx.main.migrations._inventory_source import delete_cloudforms_inv_source
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('main', '0121_delete_toweranalyticsstate'),
+    ]
+
+    operations = [
+        migrations.RunPython(delete_cloudforms_inv_source),
+    ]
@@ -5,6 +5,7 @@ from uuid import uuid4
 from django.utils.encoding import smart_text
 from django.utils.timezone import now
 
+from awx.main.utils.common import set_current_apps
 from awx.main.utils.common import parse_yaml_or_json
 
 logger = logging.getLogger('awx.main.migrations')
@@ -91,43 +92,14 @@ def back_out_new_instance_id(apps, source, new_id):
     ))
 
 
-def create_scm_script_substitute(apps, source):
-    """Only applies for cloudforms in practice, but written generally.
-    Given a source type, this will replace all inventory sources of that type
-    with SCM inventory sources that source the script from Ansible core
-    """
-    # the revision in the Ansible 2.9 stable branch this project will start out as
-    # it can still be updated manually later (but staying within 2.9 branch), if desired
-    ansible_rev = '6f83b9aff42331e15c55a171de0a8b001208c18c'
+def delete_cloudforms_inv_source(apps, schema_editor):
+    set_current_apps(apps)
     InventorySource = apps.get_model('main', 'InventorySource')
-    ContentType = apps.get_model('contenttypes', 'ContentType')
-    Project = apps.get_model('main', 'Project')
-    if not InventorySource.objects.filter(source=source).exists():
-        logger.debug('No sources of type {} to migrate'.format(source))
-        return
-    proj_name = 'Replacement project for {} type sources - {}'.format(source, uuid4())
-    right_now = now()
-    project = Project.objects.create(
-        name=proj_name,
-        created=right_now,
-        modified=right_now,
-        description='Created by migration',
-        polymorphic_ctype=ContentType.objects.get(model='project'),
-        # project-specific fields
-        scm_type='git',
-        scm_url='https://github.com/ansible/ansible.git',
-        scm_branch='stable-2.9',
-        scm_revision=ansible_rev
-    )
-    ct = 0
-    for inv_src in InventorySource.objects.filter(source=source).iterator():
-        inv_src.source = 'scm'
-        inv_src.source_project = project
-        inv_src.source_path = 'contrib/inventory/{}.py'.format(source)
-        inv_src.scm_last_revision = ansible_rev
-        inv_src.save(update_fields=['source', 'source_project', 'source_path', 'scm_last_revision'])
-        logger.debug('Changed inventory source {} to scm type'.format(inv_src.pk))
-        ct += 1
+    InventoryUpdate = apps.get_model('main', 'InventoryUpdate')
+    CredentialType = apps.get_model('main', 'CredentialType')
+    InventoryUpdate.objects.filter(inventory_source__source='cloudforms').delete()
+    InventorySource.objects.filter(source='cloudforms').delete()
+    ct = CredentialType.objects.filter(namespace='cloudforms').first()
     if ct:
-        logger.info('Changed total of {} inventory sources from {} type to scm'.format(ct, source))
+        ct.credentials.all().delete()
+        ct.delete()
@@ -881,33 +881,6 @@ ManagedCredentialType(
     }
 )
 
-ManagedCredentialType(
-    namespace='cloudforms',
-    kind='cloud',
-    name=ugettext_noop('Red Hat CloudForms'),
-    managed_by_tower=True,
-    inputs={
-        'fields': [{
-            'id': 'host',
-            'label': ugettext_noop('CloudForms URL'),
-            'type': 'string',
-            'help_text': ugettext_noop('Enter the URL for the virtual machine that '
-                                       'corresponds to your CloudForms instance. '
-                                       'For example, https://cloudforms.example.org')
-        }, {
-            'id': 'username',
-            'label': ugettext_noop('Username'),
-            'type': 'string'
-        }, {
-            'id': 'password',
-            'label': ugettext_noop('Password'),
-            'type': 'string',
-            'secret': True,
-        }],
-        'required': ['host', 'username', 'password'],
-    }
-)
-
 ManagedCredentialType(
     namespace='gce',
     kind='cloud',
@@ -261,18 +261,20 @@ class InstanceGroup(HasPolicyEditsMixin, BaseModel, RelatedJobsMixin):
         app_label = 'main'
 
 
-    def fit_task_to_most_remaining_capacity_instance(self, task):
+    @staticmethod
+    def fit_task_to_most_remaining_capacity_instance(task, instances):
         instance_most_capacity = None
-        for i in self.instances.filter(capacity__gt=0, enabled=True).order_by('hostname'):
+        for i in instances:
             if i.remaining_capacity >= task.task_impact and \
                     (instance_most_capacity is None or
                      i.remaining_capacity > instance_most_capacity.remaining_capacity):
                 instance_most_capacity = i
         return instance_most_capacity
 
-    def find_largest_idle_instance(self):
+    @staticmethod
+    def find_largest_idle_instance(instances):
         largest_instance = None
-        for i in self.instances.filter(capacity__gt=0, enabled=True).order_by('hostname'):
+        for i in instances:
             if i.jobs_running == 0:
                 if largest_instance is None:
                     largest_instance = i
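Making these methods static means they operate on any iterable of instance-like objects rather than issuing a fresh queryset per call, which is what lets the task manager hand in lightweight snapshots instead of model instances. A runnable sketch of the same selection logic over plain `SimpleNamespace` objects (the attribute names mirror the diff; the hostnames and numbers are made up):

```python
from types import SimpleNamespace

def fit_task_to_most_remaining_capacity_instance(task, instances):
    # Mirror of the InstanceGroup staticmethod: pick the instance with the most
    # remaining capacity that can still absorb the task's impact.
    best = None
    for i in instances:
        if i.remaining_capacity >= task.task_impact and \
                (best is None or i.remaining_capacity > best.remaining_capacity):
            best = i
    return best

instances = [
    SimpleNamespace(hostname='node1', remaining_capacity=10),
    SimpleNamespace(hostname='node2', remaining_capacity=50),
    SimpleNamespace(hostname='node3', remaining_capacity=30),
]
task = SimpleNamespace(task_impact=20)
chosen = fit_task_to_most_remaining_capacity_instance(task, instances)
```

If no instance can fit the task, the function returns `None` and the caller falls through to the next instance group.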
@@ -798,6 +798,10 @@ class Job(UnifiedJob, JobOptions, SurveyJobMixin, JobNotificationMixin, TaskMana
         if self.project:
             for name in ('awx', 'tower'):
                 r['{}_project_revision'.format(name)] = self.project.scm_revision
+                r['{}_project_scm_branch'.format(name)] = self.project.scm_branch
+        if self.scm_branch:
+            for name in ('awx', 'tower'):
+                r['{}_job_scm_branch'.format(name)] = self.scm_branch
         if self.job_template:
             for name in ('awx', 'tower'):
                 r['{}_job_template_id'.format(name)] = self.job_template.pk
@@ -873,7 +873,13 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
 
         # If status changed, update the parent instance.
         if self.status != status_before:
-            self._update_parent_instance()
+            # Update parent outside of the transaction for Job w/ allow_simultaneous=True
+            # This dodges lock contention at the expense of the foreign key not being
+            # completely correct.
+            if getattr(self, 'allow_simultaneous', False):
+                connection.on_commit(self._update_parent_instance)
+            else:
+                self._update_parent_instance()
 
         # Done.
         return result
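`connection.on_commit` registers a callback that Django runs only after the surrounding transaction commits, so the parent-instance update no longer competes for row locks while the transaction is open. The ordering contract can be sketched outside Django with a toy transaction object (this class is an illustrative stand-in, not Django's machinery):

```python
class FakeTransaction:
    # Toy stand-in for a database transaction supporting on_commit callbacks,
    # mimicking the ordering contract of django.db.connection.on_commit.
    def __init__(self):
        self._callbacks = []
        self.committed = False

    def on_commit(self, func):
        self._callbacks.append(func)

    def commit(self):
        self.committed = True
        for func in self._callbacks:
            func()  # runs only after commit, outside the critical section

events = []
txn = FakeTransaction()
txn.on_commit(lambda: events.append('parent updated'))
events.append('status saved')   # work done inside the transaction
txn.commit()
```

The trade-off named in the diff's comment is real: between the commit and the callback, the parent's denormalized state can briefly lag the child's.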
@@ -674,7 +674,7 @@ class WorkflowJob(UnifiedJob, WorkflowJobOptions, SurveyJobMixin, JobNotificatio
         return self.status == 'running'
 
 
-class WorkflowApprovalTemplate(UnifiedJobTemplate):
+class WorkflowApprovalTemplate(UnifiedJobTemplate, RelatedJobsMixin):
 
     FIELDS_TO_PRESERVE_AT_COPY = ['description', 'timeout',]
 
@@ -702,6 +702,12 @@ class WorkflowApprovalTemplate(UnifiedJobTemplate):
     def workflow_job_template(self):
         return self.workflowjobtemplatenodes.first().workflow_job_template
 
+    '''
+    RelatedJobsMixin
+    '''
+    def _get_related_jobs(self):
+        return UnifiedJob.objects.filter(unified_job_template=self)
+
 
 class WorkflowApproval(UnifiedJob, JobNotificationMixin):
     class Meta:
@@ -57,6 +57,7 @@ class WebhookBackend(AWXBaseEmailBackend, CustomNotificationBase):
 
     def send_messages(self, messages):
         sent_messages = 0
+        self.headers['Content-Type'] = 'application/json'
        if 'User-Agent' not in self.headers:
             self.headers['User-Agent'] = "Tower {}".format(get_awx_version())
         if self.http_method.lower() not in ['put','post']:
@@ -68,7 +69,7 @@ class WebhookBackend(AWXBaseEmailBackend, CustomNotificationBase):
             auth = (self.username, self.password)
         r = chosen_method("{}".format(m.recipients()[0]),
                           auth=auth,
-                          json=m.body,
+                          data=json.dumps(m.body, ensure_ascii=False).encode('utf-8'),
                           headers=self.headers,
                           verify=(not self.disable_ssl_verification))
         if r.status_code >= 400:
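This is the fix for the unicode notification bug (#7400): serializing the body explicitly with `ensure_ascii=False` and encoding it as UTF-8 keeps non-ASCII characters literal in the payload, whereas the default `json.dumps` behavior (which the `json=` shortcut relies on) escapes them as `\uXXXX` sequences. A small demonstration of the difference:

```python
import json

body = {'msg': 'Job ✓ finished at café'}

escaped = json.dumps(body)                      # default: ensure_ascii=True
literal = json.dumps(body, ensure_ascii=False)  # keep unicode characters as-is
payload = literal.encode('utf-8')               # bytes suitable for an HTTP body

# Both forms decode back to the same object; only the wire representation differs.
roundtrip = json.loads(payload.decode('utf-8'))
```

Some webhook receivers mishandle the escaped form, which is why the literal UTF-8 body is preferred here; the explicit `Content-Type: application/json` header added above compensates for no longer using the `json=` shortcut.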
@@ -12,6 +12,24 @@ from awx.main.utils.common import parse_yaml_or_json
 logger = logging.getLogger('awx.main.scheduler')
 
 
+def deepmerge(a, b):
+    """
+    Merge dict structures and return the result.
+
+    >>> a = {'first': {'all_rows': {'pass': 'dog', 'number': '1'}}}
+    >>> b = {'first': {'all_rows': {'fail': 'cat', 'number': '5'}}}
+    >>> import pprint; pprint.pprint(deepmerge(a, b))
+    {'first': {'all_rows': {'fail': 'cat', 'number': '5', 'pass': 'dog'}}}
+    """
+    if isinstance(a, dict) and isinstance(b, dict):
+        return dict([(k, deepmerge(a.get(k), b.get(k)))
+                     for k in set(a.keys()).union(b.keys())])
+    elif b is None:
+        return a
+    else:
+        return b
+
+
 class PodManager(object):
 
     def __init__(self, task=None):
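The `deepmerge` helper added in this hunk recursively merges nested dicts, with values from `b` winning on conflict and `None` in `b` falling back to `a`. This is what lets a user-supplied pod spec override keep its own metadata while AWX layers its pod name and labels on top. The function below is copied from the diff; the metadata values are illustrative stand-ins:

```python
def deepmerge(a, b):
    # Recursive dict merge: keys from both sides survive; on conflict b wins,
    # except that a None in b falls back to a's value.
    if isinstance(a, dict) and isinstance(b, dict):
        return dict([(k, deepmerge(a.get(k), b.get(k)))
                     for k in set(a.keys()).union(b.keys())])
    elif b is None:
        return a
    else:
        return b

# Illustrative stand-ins for a user's metadata override and AWX's own metadata.
user_metadata = {'labels': {'team': 'ops'}, 'namespace': 'awx-jobs'}
awx_metadata = {'name': 'awx-job-42', 'labels': {'ansible-awx-job-id': '42'}}
merged = deepmerge(user_metadata, awx_metadata)
```

The user's `team` label and `namespace` survive, while AWX's `name` and job-id label are merged in rather than clobbering the whole `metadata` dict, which is exactly the behavior the old assignment-based code lacked.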
@@ -128,11 +146,13 @@ class PodManager(object):
         pod_spec = {**default_pod_spec, **pod_spec_override}
 
         if self.task:
-            pod_spec['metadata']['name'] = self.pod_name
-            pod_spec['metadata']['labels'] = {
-                'ansible-awx': settings.INSTALL_UUID,
-                'ansible-awx-job-id': str(self.task.id)
-            }
+            pod_spec['metadata'] = deepmerge(
+                pod_spec.get('metadata', {}),
+                dict(name=self.pod_name,
+                     labels={
+                         'ansible-awx': settings.INSTALL_UUID,
+                         'ansible-awx-job-id': str(self.task.id)
+                     }))
             pod_spec['spec']['containers'][0]['name'] = self.pod_name
 
         return pod_spec
@@ -7,12 +7,14 @@ import logging
 import uuid
 import json
 import random
+from types import SimpleNamespace
 
 # Django
 from django.db import transaction, connection
 from django.utils.translation import ugettext_lazy as _, gettext_noop
 from django.utils.timezone import now as tz_now
 from django.conf import settings
+from django.db.models import Q
 
 # AWX
 from awx.main.dispatch.reaper import reap_job
@@ -45,6 +47,15 @@ logger = logging.getLogger('awx.main.scheduler')
 class TaskManager():
 
     def __init__(self):
+        '''
+        Do NOT put database queries or other potentially expensive operations
+        in the task manager init. The task manager object is created every time a
+        job is created, transitions state, and every 30 seconds on each tower node.
+        More often then not, the object is destroyed quickly because the NOOP case is hit.
+
+        The NOOP case is short-circuit logic. If the task manager realizes that another instance
+        of the task manager is already running, then it short-circuits and decides not to run.
+        '''
         self.graph = dict()
         # start task limit indicates how many pending jobs can be started on this
         # .schedule() run. Starting jobs is expensive, and there is code in place to reap
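The "NOOP case" the docstring describes is a try-lock short circuit: each scheduler run first attempts to take a shared lock, and if another task manager already holds it, the new one exits without doing any work. The pattern can be sketched with a process-local `threading.Lock` standing in for AWX's advisory database lock (the lock and labels here are illustrative):

```python
import threading

schedule_lock = threading.Lock()
runs = []

def schedule(label):
    # Try to take the lock without blocking; if another scheduler instance
    # holds it, short-circuit (the NOOP case) instead of doing expensive work.
    if not schedule_lock.acquire(blocking=False):
        runs.append((label, 'noop'))
        return
    try:
        runs.append((label, 'ran'))
        # ... expensive scheduling work would happen here ...
    finally:
        schedule_lock.release()

schedule('first')        # acquires the lock and runs
with schedule_lock:      # simulate another instance holding the lock
    schedule('second')   # short-circuits without scheduling anything
```

This is why deferring expensive setup out of `__init__` pays off: in the common case the object is constructed, fails to get the lock, and is thrown away having done no queries.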
@@ -52,10 +63,30 @@ class TaskManager():
         # 5 minutes to start pending jobs. If this limit is reached, pending jobs
         # will no longer be started and will be started on the next task manager cycle.
         self.start_task_limit = settings.START_TASK_LIMIT
 
+    def after_lock_init(self):
+        '''
+        Init AFTER we know this instance of the task manager will run because the lock is acquired.
+        '''
+        instances = Instance.objects.filter(~Q(hostname=None), capacity__gt=0, enabled=True)
+        self.real_instances = {i.hostname: i for i in instances}
+
+        instances_partial = [SimpleNamespace(obj=instance,
+                                             remaining_capacity=instance.remaining_capacity,
+                                             capacity=instance.capacity,
+                                             jobs_running=instance.jobs_running,
+                                             hostname=instance.hostname) for instance in instances]
+
+        instances_by_hostname = {i.hostname: i for i in instances_partial}
+
         for rampart_group in InstanceGroup.objects.prefetch_related('instances'):
             self.graph[rampart_group.name] = dict(graph=DependencyGraph(rampart_group.name),
                                                   capacity_total=rampart_group.capacity,
-                                                  consumed_capacity=0)
+                                                  consumed_capacity=0,
+                                                  instances=[])
+            for instance in rampart_group.instances.filter(capacity__gt=0, enabled=True).order_by('hostname'):
+                if instance.hostname in instances_by_hostname:
+                    self.graph[rampart_group.name]['instances'].append(instances_by_hostname[instance.hostname])
 
     def is_job_blocked(self, task):
         # TODO: I'm not happy with this, I think blocking behavior should be decided outside of the dependency graph
@@ -254,7 +285,7 @@ class TaskManager():
             for group in InstanceGroup.objects.all():
                 if group.is_containerized or group.controller_id:
                     continue
-                match = group.fit_task_to_most_remaining_capacity_instance(task)
+                match = group.fit_task_to_most_remaining_capacity_instance(task, group.instances.all())
                 if match:
                     break
             task.instance_group = rampart_group
@@ -466,7 +497,6 @@ class TaskManager():
                 continue
             preferred_instance_groups = task.preferred_instance_groups
             found_acceptable_queue = False
-            idle_instance_that_fits = None
             if isinstance(task, WorkflowJob):
                 if task.unified_job_template_id in running_workflow_templates:
                     if not task.allow_simultaneous:
@@ -483,24 +513,24 @@ class TaskManager():
                     found_acceptable_queue = True
                     break
 
-                if idle_instance_that_fits is None:
-                    idle_instance_that_fits = rampart_group.find_largest_idle_instance()
                 remaining_capacity = self.get_remaining_capacity(rampart_group.name)
                 if not rampart_group.is_containerized and self.get_remaining_capacity(rampart_group.name) <= 0:
                     logger.debug("Skipping group {}, remaining_capacity {} <= 0".format(
                         rampart_group.name, remaining_capacity))
                     continue
 
-                execution_instance = rampart_group.fit_task_to_most_remaining_capacity_instance(task)
-                if execution_instance:
-                    logger.debug("Starting {} in group {} instance {} (remaining_capacity={})".format(
-                        task.log_format, rampart_group.name, execution_instance.hostname, remaining_capacity))
-                elif not execution_instance and idle_instance_that_fits:
+                execution_instance = InstanceGroup.fit_task_to_most_remaining_capacity_instance(task, self.graph[rampart_group.name]['instances']) or \
+                    InstanceGroup.find_largest_idle_instance(self.graph[rampart_group.name]['instances'])
+
+                if execution_instance or rampart_group.is_containerized:
                     if not rampart_group.is_containerized:
-                        execution_instance = idle_instance_that_fits
+                        execution_instance.remaining_capacity = max(0, execution_instance.remaining_capacity - task.task_impact)
+                        execution_instance.jobs_running += 1
                         logger.debug("Starting {} in group {} instance {} (remaining_capacity={})".format(
                             task.log_format, rampart_group.name, execution_instance.hostname, remaining_capacity))
-                if execution_instance or rampart_group.is_containerized:
+
+                    if execution_instance:
+                        execution_instance = self.real_instances[execution_instance.hostname]
                     self.graph[rampart_group.name]['graph'].add_job(task)
                     self.start_task(task, rampart_group, task.get_jobs_fail_chain(), execution_instance)
                     found_acceptable_queue = True
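Because the scheduler now fits tasks against plain in-memory snapshots, it must decrement `remaining_capacity` and bump `jobs_running` itself after each assignment so later tasks in the same pass see the updated numbers. A self-contained sketch of that bookkeeping loop (hostnames, names, and impacts are made up; the fit rule mirrors `fit_task_to_most_remaining_capacity_instance`):

```python
from types import SimpleNamespace

def assign_tasks(tasks, instances):
    # Greedy pass mirroring the scheduler loop: fit each task, then update the
    # in-memory snapshot so the next task sees the reduced capacity.
    assignments = []
    for task in tasks:
        fit = None
        for inst in instances:
            if inst.remaining_capacity >= task.task_impact and \
                    (fit is None or inst.remaining_capacity > fit.remaining_capacity):
                fit = inst
        if fit is None:
            continue  # no instance fits; task waits for the next cycle
        fit.remaining_capacity = max(0, fit.remaining_capacity - task.task_impact)
        fit.jobs_running += 1
        assignments.append((task.name, fit.hostname))
    return assignments

instances = [SimpleNamespace(hostname='node1', remaining_capacity=30, jobs_running=0),
             SimpleNamespace(hostname='node2', remaining_capacity=25, jobs_running=0)]
tasks = [SimpleNamespace(name='job1', task_impact=20),
         SimpleNamespace(name='job2', task_impact=20)]
result = assign_tasks(tasks, instances)
```

Without the in-place updates both jobs would be fitted to `node1`, which is the over-scheduling behavior this part of the change avoids; the real code then swaps the snapshot for the model instance (`self.real_instances[...]`) before starting the task.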
@@ -572,6 +602,9 @@ class TaskManager():
     def _schedule(self):
         finished_wfjs = []
         all_sorted_tasks = self.get_tasks()
+
+        self.after_lock_init()
+
         if len(all_sorted_tasks) > 0:
             # TODO: Deal with
             # latest_project_updates = self.get_latest_project_update_tasks(all_sorted_tasks)
@@ -313,7 +313,7 @@ def delete_project_files(project_path):
 
 @task(queue='tower_broadcast_all')
 def profile_sql(threshold=1, minutes=1):
-    if threshold == 0:
+    if threshold <= 0:
         cache.delete('awx-profile-sql-threshold')
         logger.error('SQL PROFILING DISABLED')
     else:
@@ -2160,7 +2160,7 @@ class RunProjectUpdate(BaseTask):
             'local_path': os.path.basename(project_update.project.local_path),
             'project_path': project_update.get_project_path(check_if_exists=False),  # deprecated
             'insights_url': settings.INSIGHTS_URL_BASE,
-            'awx_license_type': get_license(show_key=False).get('license_type', 'UNLICENSED'),
+            'awx_license_type': get_license().get('license_type', 'UNLICENSED'),
             'awx_version': get_awx_version(),
             'scm_url': scm_url,
             'scm_branch': scm_branch,
@@ -1,34 +0,0 @@
-import pytest
-import random
-
-from awx.main.models import Project
-from awx.main.analytics import collectors
-
-
-@pytest.mark.django_db
-def test_empty():
-    assert collectors.projects_by_scm_type(None) == {
-        'manual': 0,
-        'git': 0,
-        'svn': 0,
-        'hg': 0,
-        'insights': 0,
-        'archive': 0,
-    }
-
-
-@pytest.mark.django_db
-@pytest.mark.parametrize('scm_type', [t[0] for t in Project.SCM_TYPE_CHOICES])
-def test_multiple(scm_type):
-    expected = {
-        'manual': 0,
-        'git': 0,
-        'svn': 0,
-        'hg': 0,
-        'insights': 0,
-        'archive': 0,
-    }
-    for i in range(random.randint(0, 10)):
-        Project(scm_type=scm_type).save()
-        expected[scm_type or 'manual'] += 1
-    assert collectors.projects_by_scm_type(None) == expected
@@ -675,33 +675,6 @@ def test_net_create_ok(post, organization, admin):
     assert cred.inputs['authorize'] is True
 
 
-#
-# Cloudforms Credentials
-#
-@pytest.mark.django_db
-def test_cloudforms_create_ok(post, organization, admin):
-    params = {
-        'credential_type': 1,
-        'name': 'Best credential ever',
-        'inputs': {
-            'host': 'some_host',
-            'username': 'some_username',
-            'password': 'some_password',
-        }
-    }
-    cloudforms = CredentialType.defaults['cloudforms']()
-    cloudforms.save()
-    params['organization'] = organization.id
-    response = post(reverse('api:credential_list'), params, admin)
-    assert response.status_code == 201
-
-    assert Credential.objects.count() == 1
-    cred = Credential.objects.all()[:1].get()
-    assert cred.inputs['host'] == 'some_host'
-    assert cred.inputs['username'] == 'some_username'
-    assert decrypt_field(cred, 'password') == 'some_password'
-
-
 #
 # GCE Credentials
 #
@@ -99,3 +99,12 @@ def test_changing_overwrite_behavior_okay_if_not_used(post, patch, organization,
         expect=200
     )
     assert Project.objects.get(pk=r1.data['id']).allow_override is False
+
+
+@pytest.mark.django_db
+def test_scm_project_local_path_invalid(get, patch, project, admin):
+    url = reverse('api:project_detail', kwargs={'pk': project.id})
+    resp = patch(url, {'local_path': '/foo/bar'}, user=admin, expect=400)
+    assert resp.data['local_path'] == [
+        'Cannot change local_path for git-based projects'
+    ]
@@ -282,10 +282,6 @@ def test_prefetch_ujt_project_capabilities(alice, project, job_template, mocker)
         list_serializer.child.to_representation(project)
         assert 'capability_map' not in list_serializer.child.context
 
-        # Models for which the prefetch is valid for do
-        list_serializer.child.to_representation(job_template)
-        assert set(list_serializer.child.context['capability_map'][job_template.id].keys()) == set(('copy', 'edit', 'start'))
-
 
 @pytest.mark.django_db
 def test_prefetch_group_capabilities(group, rando):
@@ -349,7 +349,7 @@ def test_months_with_31_days(post, admin_user):
     ('MINUTELY', 1, 60),
     ('MINUTELY', 15, 15 * 60),
     ('HOURLY', 1, 3600),
-    ('HOURLY', 4, 3600 * 4),
+    ('HOURLY', 2, 3600 * 2),
 ))
 def test_really_old_dtstart(post, admin_user, freq, delta, total_seconds):
     url = reverse('api:schedule_rrule')
@@ -89,7 +89,7 @@ class TestApprovalNodes():
         url = reverse('api:workflow_job_template_node_create_approval',
                       kwargs={'pk': approval_node.pk, 'version': 'v2'})
         post(url, {'name': 'Test', 'description': 'Approval Node', 'timeout': 0},
-             user=admin_user, expect=200)
+             user=admin_user, expect=201)
 
         approval_node = WorkflowJobTemplateNode.objects.get(pk=approval_node.pk)
         assert isinstance(approval_node.unified_job_template, WorkflowApprovalTemplate)
@@ -108,9 +108,9 @@ class TestApprovalNodes():
         assert {'name': ['This field may not be blank.']} == json.loads(r.content)
 
     @pytest.mark.parametrize("is_admin, is_org_admin, status", [
-        [True, False, 200],   # if they're a WFJT admin, they get a 200
+        [True, False, 201],   # if they're a WFJT admin, they get a 201
         [False, False, 403],  # if they're not a WFJT *nor* org admin, they get a 403
-        [False, True, 200],   # if they're an organization admin, they get a 200
+        [False, True, 201],   # if they're an organization admin, they get a 201
     ])
     def test_approval_node_creation_rbac(self, post, approval_node, alice, is_admin, is_org_admin, status):
         url = reverse('api:workflow_job_template_node_create_approval',
@@ -165,7 +165,7 @@ class TestApprovalNodes():
         url = reverse('api:workflow_job_template_node_create_approval',
                       kwargs={'pk': node.pk, 'version': 'v2'})
         post(url, {'name': 'Approve Test', 'description': '', 'timeout': 0},
-             user=admin_user, expect=200)
+             user=admin_user, expect=201)
         post(reverse('api:workflow_job_template_launch', kwargs={'pk': wfjt.pk}),
              user=admin_user, expect=201)
         wf_job = WorkflowJob.objects.first()
@@ -195,7 +195,7 @@ class TestApprovalNodes():
         url = reverse('api:workflow_job_template_node_create_approval',
                       kwargs={'pk': node.pk, 'version': 'v2'})
         post(url, {'name': 'Deny Test', 'description': '', 'timeout': 0},
-             user=admin_user, expect=200)
+             user=admin_user, expect=201)
         post(reverse('api:workflow_job_template_launch', kwargs={'pk': wfjt.pk}),
              user=admin_user, expect=201)
         wf_job = WorkflowJob.objects.first()
@@ -1,13 +0,0 @@
-# Copyright (c) 2015 Ansible, Inc.
-# All Rights Reserved.
-
-from awx.main.utils.common import StubLicense
-
-
-def test_stub_license():
-    license_actual = StubLicense().validate()
-    assert license_actual['license_key'] == 'OPEN'
-    assert license_actual['valid_key']
-    assert license_actual['compliant']
-    assert license_actual['license_type'] == 'open'
-
@@ -79,7 +79,6 @@ def test_default_cred_types():
         'aws',
         'azure_kv',
         'azure_rm',
-        'cloudforms',
         'conjur',
         'galaxy_api_token',
         'gce',
@@ -5,7 +5,7 @@ from awx.main.migrations import _inventory_source as invsrc

 from django.apps import apps

-from awx.main.models import InventorySource
+from awx.main.models import InventorySource, InventoryUpdate, ManagedCredentialType, CredentialType, Credential


 @pytest.mark.parametrize('vars,id_var,result', [
@@ -42,16 +42,40 @@ def test_apply_new_instance_id(inventory_source):


 @pytest.mark.django_db
-def test_replacement_scm_sources(inventory):
-    inv_source = InventorySource.objects.create(
-        name='test',
-        inventory=inventory,
-        organization=inventory.organization,
-        source='ec2'
+def test_cloudforms_inventory_removal(inventory):
+    ManagedCredentialType(
+        name='Red Hat CloudForms',
+        namespace='cloudforms',
+        kind='cloud',
+        managed_by_tower=True,
+        inputs={},
     )
-    invsrc.create_scm_script_substitute(apps, 'ec2')
-    inv_source.refresh_from_db()
-    assert inv_source.source == 'scm'
-    assert inv_source.source_project
-    project = inv_source.source_project
-    assert 'Replacement project for' in project.name
+    CredentialType.defaults['cloudforms']().save()
+    cloudforms = CredentialType.objects.get(namespace='cloudforms')
+    Credential.objects.create(
+        name='test',
+        credential_type=cloudforms,
+    )

+    for source in ('ec2', 'cloudforms'):
+        i = InventorySource.objects.create(
+            name='test',
+            inventory=inventory,
+            organization=inventory.organization,
+            source=source,
+        )
+        InventoryUpdate.objects.create(
+            name='test update',
+            inventory_source=i,
+            source=source,
+        )
+    assert Credential.objects.count() == 1
+    assert InventorySource.objects.count() == 2  # ec2 + cf
+    assert InventoryUpdate.objects.count() == 2  # ec2 + cf
+    invsrc.delete_cloudforms_inv_source(apps, None)
+    assert InventorySource.objects.count() == 1  # ec2
+    assert InventoryUpdate.objects.count() == 1  # ec2
+    assert InventorySource.objects.first().source == 'ec2'
+    assert InventoryUpdate.objects.first().source == 'ec2'
+    assert Credential.objects.count() == 0
+    assert CredentialType.objects.filter(namespace='cloudforms').exists() is False
@@ -1,6 +1,5 @@

 import glob
-import json
 import os

 from django.conf import settings
@@ -30,8 +29,7 @@ def test_python_and_js_licenses():
         # Check variations of '-' and '_' in filenames due to python
         for fname in [name, name.replace('-','_')]:
             if entry.startswith(fname) and entry.endswith('.tar.gz'):
-                entry = entry[:-7]
-                (n, v) = entry.rsplit('-',1)
+                v = entry.split(name + '-')[1].split('.tar.gz')[0]
                 return v
         return None

@@ -66,28 +64,6 @@ def test_python_and_js_licenses():
             ret[name] = { 'name': name, 'version': version}
         return ret


-    def read_ui_requirements(path):
-        def json_deps(jsondata):
-            ret = {}
-            deps = jsondata.get('dependencies',{})
-            for key in deps.keys():
-                key = key.lower()
-                devonly = deps[key].get('dev',False)
-                if not devonly:
-                    if key not in ret.keys():
-                        depname = key.replace('/','-')
-                        ret[depname] = {
-                            'name': depname,
-                            'version': deps[key]['version']
-                        }
-                        ret.update(json_deps(deps[key]))
-            return ret
-
-        with open('%s/package-lock.json' % path) as f:
-            jsondata = json.load(f)
-        return json_deps(jsondata)
-
     def remediate_licenses_and_requirements(licenses, requirements):
         errors = []
         items = list(licenses.keys())
@@ -114,12 +90,9 @@ def test_python_and_js_licenses():

     base_dir = settings.BASE_DIR
     api_licenses = index_licenses('%s/../docs/licenses' % base_dir)
-    ui_licenses = index_licenses('%s/../docs/licenses/ui' % base_dir)
     api_requirements = read_api_requirements('%s/../requirements' % base_dir)
-    ui_requirements = read_ui_requirements('%s/ui' % base_dir)

     errors = []
-    errors += remediate_licenses_and_requirements(ui_licenses, ui_requirements)
     errors += remediate_licenses_and_requirements(api_licenses, api_requirements)
     if errors:
         raise Exception('Included licenses not consistent with requirements:\n%s' %
@@ -45,19 +45,14 @@ class TestInstanceGroup(object):
         (T(100), Is([50, 0, 20, 99, 11, 1, 5, 99]), None, "The task don't a fit, you must a quit!"),
     ])
     def test_fit_task_to_most_remaining_capacity_instance(self, task, instances, instance_fit_index, reason):
-        with mock.patch.object(InstanceGroup,
-                               'instances',
-                               Mock(spec_set=['filter'],
-                                    filter=lambda *args, **kargs: Mock(spec_set=['order_by'],
-                                                                       order_by=lambda x: instances))):
-            ig = InstanceGroup(id=10)
+        ig = InstanceGroup(id=10)

-            if instance_fit_index is None:
-                assert ig.fit_task_to_most_remaining_capacity_instance(task) is None, reason
-            else:
-                assert ig.fit_task_to_most_remaining_capacity_instance(task) == \
-                    instances[instance_fit_index], reason
+        instance_picked = ig.fit_task_to_most_remaining_capacity_instance(task, instances)

+        if instance_fit_index is None:
+            assert instance_picked is None, reason
+        else:
+            assert instance_picked == instances[instance_fit_index], reason

     @pytest.mark.parametrize('instances,instance_fit_index,reason', [
         (Is([(0, 100)]), 0, "One idle instance, pick it"),
@@ -70,16 +65,12 @@ class TestInstanceGroup(object):
         def filter_offline_instances(*args):
            return filter(lambda i: i.capacity > 0, instances)

-        with mock.patch.object(InstanceGroup,
-                               'instances',
-                               Mock(spec_set=['filter'],
-                                    filter=lambda *args, **kargs: Mock(spec_set=['order_by'],
-                                                                       order_by=filter_offline_instances))):
-            ig = InstanceGroup(id=10)
+        ig = InstanceGroup(id=10)
+        instances_online_only = filter_offline_instances(instances)

         if instance_fit_index is None:
-            assert ig.find_largest_idle_instance() is None, reason
+            assert ig.find_largest_idle_instance(instances_online_only) is None, reason
         else:
-            assert ig.find_largest_idle_instance() == \
+            assert ig.find_largest_idle_instance(instances_online_only) == \
                 instances[instance_fit_index], reason

@@ -39,6 +39,8 @@ from awx.main import tasks
 from awx.main.utils import encrypt_field, encrypt_value
 from awx.main.utils.safe_yaml import SafeLoader

+from awx.main.utils.licensing import Licenser
+

 class TestJobExecution(object):
     EXAMPLE_PRIVATE_KEY = '-----BEGIN PRIVATE KEY-----\nxyz==\n-----END PRIVATE KEY-----'
@@ -1830,7 +1832,10 @@ class TestProjectUpdateGalaxyCredentials(TestJobExecution):

         task = RunProjectUpdate()
         env = task.build_env(project_update, private_data_dir)
-        task.build_extra_vars_file(project_update, private_data_dir)
+
+        with mock.patch.object(Licenser, 'validate', lambda *args, **kw: {}):
+            task.build_extra_vars_file(project_update, private_data_dir)

         assert task.__vars__['roles_enabled'] is False
         assert task.__vars__['collections_enabled'] is False
         for k in env:
@@ -1850,7 +1855,10 @@ class TestProjectUpdateGalaxyCredentials(TestJobExecution):
         project_update.project.organization.galaxy_credentials.add(public_galaxy)
         task = RunProjectUpdate()
         env = task.build_env(project_update, private_data_dir)
-        task.build_extra_vars_file(project_update, private_data_dir)
+
+        with mock.patch.object(Licenser, 'validate', lambda *args, **kw: {}):
+            task.build_extra_vars_file(project_update, private_data_dir)

         assert task.__vars__['roles_enabled'] is True
         assert task.__vars__['collections_enabled'] is True
         assert sorted([
@@ -1935,7 +1943,9 @@ class TestProjectUpdateCredentials(TestJobExecution):
         assert settings.PROJECTS_ROOT in process_isolation['process_isolation_show_paths']

         task._write_extra_vars_file = mock.Mock()
-        task.build_extra_vars_file(project_update, private_data_dir)
+
+        with mock.patch.object(Licenser, 'validate', lambda *args, **kw: {}):
+            task.build_extra_vars_file(project_update, private_data_dir)

         call_args, _ = task._write_extra_vars_file.call_args_list[0]
         _, extra_vars = call_args
@@ -2140,10 +2150,6 @@ class TestInventoryUpdateCredentials(TestJobExecution):
             return cred
         inventory_update.get_cloud_credential = get_cred
         inventory_update.get_extra_credentials = mocker.Mock(return_value=[])
-        inventory_update.source_vars = {
-            'include_powerstate': 'yes',
-            'group_by_resource_group': 'no'
-        }

         private_data_files = task.build_private_data_files(inventory_update, private_data_dir)
         env = task.build_env(inventory_update, private_data_dir, False, private_data_files)
@@ -2177,11 +2183,6 @@ class TestInventoryUpdateCredentials(TestJobExecution):
             return cred
         inventory_update.get_cloud_credential = get_cred
         inventory_update.get_extra_credentials = mocker.Mock(return_value=[])
-        inventory_update.source_vars = {
-            'include_powerstate': 'yes',
-            'group_by_resource_group': 'no',
-            'group_by_security_group': 'no'
-        }

         private_data_files = task.build_private_data_files(inventory_update, private_data_dir)
         env = task.build_env(inventory_update, private_data_dir, False, private_data_files)
@@ -2296,21 +2297,14 @@ class TestInventoryUpdateCredentials(TestJobExecution):
         inventory_update.get_cloud_credential = get_cred
         inventory_update.get_extra_credentials = mocker.Mock(return_value=[])

-        inventory_update.source_vars = {
-            'satellite6_group_patterns': '[a,b,c]',
-            'satellite6_group_prefix': 'hey_',
-            'satellite6_want_hostcollections': True,
-            'satellite6_want_ansible_ssh_host': True,
-            'satellite6_rich_params': True,
-            'satellite6_want_facts': False
-        }

         private_data_files = task.build_private_data_files(inventory_update, private_data_dir)
         env = task.build_env(inventory_update, private_data_dir, False, private_data_files)
+        safe_env = build_safe_env(env)

-        env["FOREMAN_SERVER"] == "https://example.org",
-        env["FOREMAN_USER"] == "bob",
-        env["FOREMAN_PASSWORD"] == "secret",
+        assert env["FOREMAN_SERVER"] == "https://example.org"
+        assert env["FOREMAN_USER"] == "bob"
+        assert env["FOREMAN_PASSWORD"] == "secret"
+        assert safe_env["FOREMAN_PASSWORD"] == tasks.HIDDEN_PASSWORD

     @pytest.mark.parametrize('verify', [True, False])
     def test_tower_source(self, verify, inventory_update, private_data_dir, mocker):
@@ -55,8 +55,7 @@ __all__ = [
     'model_instance_diff', 'parse_yaml_or_json', 'RequireDebugTrueOrTest',
     'has_model_field_prefetched', 'set_environ', 'IllegalArgumentError',
     'get_custom_venv_choices', 'get_external_account', 'task_manager_bulk_reschedule',
-    'schedule_task_manager', 'classproperty', 'create_temporary_fifo', 'truncate_stdout',
-    'StubLicense'
+    'schedule_task_manager', 'classproperty', 'create_temporary_fifo', 'truncate_stdout'
 ]


@@ -190,7 +189,7 @@ def get_awx_version():


 def get_awx_http_client_headers():
-    license = get_license(show_key=False).get('license_type', 'UNLICENSED')
+    license = get_license().get('license_type', 'UNLICENSED')
     headers = {
         'Content-Type': 'application/json',
         'User-Agent': '{} {} ({})'.format(
@@ -202,34 +201,15 @@ def get_awx_http_client_headers():
     return headers


-class StubLicense(object):
-
-    features = {
-        'activity_streams': True,
-        'ha': True,
-        'ldap': True,
-        'multiple_organizations': True,
-        'surveys': True,
-        'system_tracking': True,
-        'rebranding': True,
-        'enterprise_auth': True,
-        'workflows': True,
-    }
-
-    def validate(self):
-        return dict(license_key='OPEN',
-                    valid_key=True,
-                    compliant=True,
-                    features=self.features,
-                    license_type='open')
-
-
 def get_licenser(*args, **kwargs):
+    from awx.main.utils.licensing import Licenser, OpenLicense
     try:
-        from tower_license import TowerLicense
-        return TowerLicense(*args, **kwargs)
-    except ImportError:
-        return StubLicense(*args, **kwargs)
+        if os.path.exists('/var/lib/awx/.tower_version'):
+            return Licenser(*args, **kwargs)
+        else:
+            return OpenLicense()
+    except Exception as e:
+        raise ValueError(_('Error importing Tower License: %s') % e)


 def update_scm_url(scm_type, url, username=True, password=True,
awx/main/utils/licensing.py (new file, 393 additions)
@@ -0,0 +1,393 @@
+# Copyright (c) 2015 Ansible, Inc.
+# All Rights Reserved.
+
+'''
+This is intended to be a lightweight license class for verifying subscriptions, and parsing subscription data
+from entitlement certificates.
+
+The Licenser class can do the following:
+ - Parse an Entitlement cert to generate license
+'''
+
+import base64
+import configparser
+from datetime import datetime
+import collections
+import copy
+import io
+import json
+import logging
+import re
+import requests
+import time
+import zipfile
+
+from dateutil.parser import parse as parse_date
+
+from cryptography.exceptions import InvalidSignature
+from cryptography.hazmat.primitives import hashes
+from cryptography.hazmat.backends import default_backend
+from cryptography.hazmat.primitives.asymmetric import padding
+from cryptography import x509
+
+# Django
+from django.conf import settings
+from django.utils.translation import ugettext_lazy as _
+
+# AWX
+from awx.main.models import Host
+
+MAX_INSTANCES = 9999999
+
+logger = logging.getLogger(__name__)
+
+
+def rhsm_config():
+    path = '/etc/rhsm/rhsm.conf'
+    config = configparser.ConfigParser()
+    config.read(path)
+    return config
+
+
+def validate_entitlement_manifest(data):
+    buff = io.BytesIO()
+    buff.write(base64.b64decode(data))
+    try:
+        z = zipfile.ZipFile(buff)
+    except zipfile.BadZipFile as e:
+        raise ValueError(_("Invalid manifest: a subscription manifest zip file is required.")) from e
+    buff = io.BytesIO()
+
+    files = z.namelist()
+    if 'consumer_export.zip' not in files or 'signature' not in files:
+        raise ValueError(_("Invalid manifest: missing required files."))
+    export = z.open('consumer_export.zip').read()
+    sig = z.open('signature').read()
+    with open('/etc/tower/candlepin-redhat-ca.crt', 'rb') as f:
+        cert = x509.load_pem_x509_certificate(f.read(), backend=default_backend())
+    key = cert.public_key()
+    try:
+        key.verify(sig, export, padding=padding.PKCS1v15(), algorithm=hashes.SHA256())
+    except InvalidSignature as e:
+        raise ValueError(_("Invalid manifest: signature verification failed.")) from e
+
+    buff.write(export)
+    z = zipfile.ZipFile(buff)
+    for f in z.filelist:
+        if f.filename.startswith('export/entitlements') and f.filename.endswith('.json'):
+            return json.loads(z.open(f).read())
+    raise ValueError(_("Invalid manifest: manifest contains no subscriptions."))
+
+
+class OpenLicense(object):
+    def validate(self):
+        return dict(
+            license_type='open',
+            valid_key=True,
+            subscription_name='OPEN',
+            product_name="AWX",
+        )
+
+
+class Licenser(object):
+    # warn when there is a month (30 days) left on the subscription
+    SUBSCRIPTION_TIMEOUT = 60 * 60 * 24 * 30
+
+    UNLICENSED_DATA = dict(
+        subscription_name=None,
+        sku=None,
+        support_level=None,
+        instance_count=0,
+        license_date=0,
+        license_type="UNLICENSED",
+        product_name="Red Hat Ansible Automation Platform",
+        valid_key=False
+    )
+
+    def __init__(self, **kwargs):
+        self._attrs = dict(
+            instance_count=0,
+            license_date=0,
+            license_type='UNLICENSED',
+        )
+        self.config = rhsm_config()
+        if not kwargs:
+            license_setting = getattr(settings, 'LICENSE', None)
+            if license_setting is not None:
+                kwargs = license_setting
+
+        if 'company_name' in kwargs:
+            kwargs.pop('company_name')
+        self._attrs.update(kwargs)
+        if 'valid_key' in self._attrs:
+            if not self._attrs['valid_key']:
+                self._unset_attrs()
+        else:
+            self._unset_attrs()
+
+
+    def _unset_attrs(self):
+        self._attrs = self.UNLICENSED_DATA.copy()
+
+
+    def license_from_manifest(self, manifest):
+        # Parse output for subscription metadata to build config
+        license = dict()
+        license['sku'] = manifest['pool']['productId']
+        try:
+            license['instance_count'] = manifest['pool']['exported']
+        except KeyError:
+            license['instance_count'] = manifest['pool']['quantity']
+        license['subscription_name'] = manifest['pool']['productName']
+        license['pool_id'] = manifest['pool']['id']
+        license['license_date'] = parse_date(manifest['endDate']).strftime('%s')
+        license['product_name'] = manifest['pool']['productName']
+        license['valid_key'] = True
+        license['license_type'] = 'enterprise'
+        license['satellite'] = False
+
+        self._attrs.update(license)
+        settings.LICENSE = self._attrs
+        return self._attrs
+
+
+    def update(self, **kwargs):
+        # Update attributes of the current license.
+        if 'instance_count' in kwargs:
+            kwargs['instance_count'] = int(kwargs['instance_count'])
+        if 'license_date' in kwargs:
+            kwargs['license_date'] = int(kwargs['license_date'])
+        self._attrs.update(kwargs)
+
+
+    def validate_rh(self, user, pw):
+        try:
+            host = 'https://' + str(self.config.get("server", "hostname"))
+        except Exception:
+            logger.exception('Cannot access rhsm.conf, make sure subscription manager is installed and configured.')
+            host = None
+        if not host:
+            host = getattr(settings, 'REDHAT_CANDLEPIN_HOST', None)
+
+        if not user:
+            raise ValueError('subscriptions_username is required')
+
+        if not pw:
+            raise ValueError('subscriptions_password is required')
+
+        if host and user and pw:
+            if 'subscription.rhsm.redhat.com' in host:
+                json = self.get_rhsm_subs(host, user, pw)
+            else:
+                json = self.get_satellite_subs(host, user, pw)
+            return self.generate_license_options_from_entitlements(json)
+        return []
+
+
+    def get_rhsm_subs(self, host, user, pw):
+        verify = getattr(settings, 'REDHAT_CANDLEPIN_VERIFY', True)
+        json = []
+        try:
+            subs = requests.get(
+                '/'.join([host, 'subscription/users/{}/owners'.format(user)]),
+                verify=verify,
+                auth=(user, pw)
+            )
+        except requests.exceptions.ConnectionError as error:
+            raise error
+        except OSError as error:
+            raise OSError('Unable to open certificate bundle {}. Check that Ansible Tower is running on Red Hat Enterprise Linux.'.format(verify)) from error  # noqa
+        subs.raise_for_status()
+
+        for sub in subs.json():
+            resp = requests.get(
+                '/'.join([
+                    host,
+                    'subscription/owners/{}/pools/?match=*tower*'.format(sub['key'])
+                ]),
+                verify=verify,
+                auth=(user, pw)
+            )
+            resp.raise_for_status()
+            json.extend(resp.json())
+        return json
+
+
+    def get_satellite_subs(self, host, user, pw):
+        try:
+            verify = str(self.config.get("rhsm", "repo_ca_cert"))
+        except Exception as e:
+            logger.exception('Unable to read rhsm config to get ca_cert location. {}'.format(str(e)))
+            verify = getattr(settings, 'REDHAT_CANDLEPIN_VERIFY', True)
+        json = []
+        try:
+            orgs = requests.get(
+                '/'.join([host, 'katello/api/organizations']),
+                verify=verify,
+                auth=(user, pw)
+            )
+        except requests.exceptions.ConnectionError as error:
+            raise error
+        except OSError as error:
+            raise OSError('Unable to open certificate bundle {}. Check that Ansible Tower is running on Red Hat Enterprise Linux.'.format(verify)) from error  # noqa
+        orgs.raise_for_status()
+
+        for org in orgs.json()['results']:
+            resp = requests.get(
+                '/'.join([
+                    host,
+                    '/katello/api/organizations/{}/subscriptions/?search=Red Hat Ansible Automation'.format(org['id'])
+                ]),
+                verify=verify,
+                auth=(user, pw)
+            )
+            resp.raise_for_status()
+            results = resp.json()['results']
+            if results != []:
+                for sub in results:
+                    # Parse output for subscription metadata to build config
+                    license = dict()
+                    license['productId'] = sub['product_id']
+                    license['quantity'] = int(sub['quantity'])
+                    license['support_level'] = sub['support_level']
+                    license['subscription_name'] = sub['name']
+                    license['id'] = sub['upstream_pool_id']
+                    license['endDate'] = sub['end_date']
+                    license['productName'] = "Red Hat Ansible Automation"
+                    license['valid_key'] = True
+                    license['license_type'] = 'enterprise'
+                    license['satellite'] = True
+                    json.append(license)
+        return json
+
+
+    def is_appropriate_sat_sub(self, sub):
+        if 'Red Hat Ansible Automation' not in sub['subscription_name']:
+            return False
+        return True
+
+
+    def is_appropriate_sub(self, sub):
+        if sub['activeSubscription'] is False:
+            return False
+        # Products that contain Ansible Tower
+        products = sub.get('providedProducts', [])
+        if any(map(lambda product: product.get('productId', None) == "480", products)):
+            return True
+        return False
+
+
+    def generate_license_options_from_entitlements(self, json):
+        from dateutil.parser import parse
+        ValidSub = collections.namedtuple('ValidSub', 'sku name support_level end_date trial quantity pool_id satellite')
+        valid_subs = []
+        for sub in json:
+            satellite = sub.get('satellite')
+            if satellite:
+                is_valid = self.is_appropriate_sat_sub(sub)
+            else:
+                is_valid = self.is_appropriate_sub(sub)
+            if is_valid:
+                try:
+                    end_date = parse(sub.get('endDate'))
+                except Exception:
+                    continue
+                now = datetime.utcnow()
+                now = now.replace(tzinfo=end_date.tzinfo)
+                if end_date < now:
+                    # If the sub has a past end date, skip it
+                    continue
+                try:
+                    quantity = int(sub['quantity'])
+                    if quantity == -1:
+                        # effectively, unlimited
+                        quantity = MAX_INSTANCES
+                except Exception:
+                    continue
+
+                sku = sub['productId']
+                trial = sku.startswith('S')  # i.e.,, SER/SVC
+                support_level = ''
+                pool_id = sub['id']
+                if satellite:
+                    support_level = sub['support_level']
+                else:
+                    for attr in sub.get('productAttributes', []):
+                        if attr.get('name') == 'support_level':
+                            support_level = attr.get('value')
+
+                valid_subs.append(ValidSub(
+                    sku, sub['productName'], support_level, end_date, trial, quantity, pool_id, satellite
+                ))
+
+        if valid_subs:
+            licenses = []
+            for sub in valid_subs:
+                license = self.__class__(subscription_name='Red Hat Ansible Automation Platform')
+                license._attrs['instance_count'] = int(sub.quantity)
+                license._attrs['sku'] = sub.sku
+                license._attrs['support_level'] = sub.support_level
+                license._attrs['license_type'] = 'enterprise'
+                if sub.trial:
+                    license._attrs['trial'] = True
+                    license._attrs['license_type'] = 'trial'
+                license._attrs['instance_count'] = min(
+                    MAX_INSTANCES, license._attrs['instance_count']
+                )
+                human_instances = license._attrs['instance_count']
+                if human_instances == MAX_INSTANCES:
+                    human_instances = 'Unlimited'
+                subscription_name = re.sub(
+                    r' \([\d]+ Managed Nodes',
+                    ' ({} Managed Nodes'.format(human_instances),
+                    sub.name
+                )
+                license._attrs['subscription_name'] = subscription_name
+                license._attrs['satellite'] = satellite
+                license._attrs['valid_key'] = True
+                license.update(
+                    license_date=int(sub.end_date.strftime('%s'))
+                )
+                license.update(
+                    pool_id=sub.pool_id
+                )
+                licenses.append(license._attrs.copy())
+            return licenses
+
+        raise ValueError(
+            'No valid Red Hat Ansible Automation subscription could be found for this account.'  # noqa
+        )
+
+
+    def validate(self):
+        # Return license attributes with additional validation info.
+        attrs = copy.deepcopy(self._attrs)
+        type = attrs.get('license_type', 'none')

+        if (type == 'UNLICENSED' or False):
+            attrs.update(dict(valid_key=False, compliant=False))
+            return attrs
+        attrs['valid_key'] = True
+
+        if Host:
+            current_instances = Host.objects.active_count()
+        else:
+            current_instances = 0
+        available_instances = int(attrs.get('instance_count', None) or 0)
+        attrs['current_instances'] = current_instances
+        attrs['available_instances'] = available_instances
+        free_instances = (available_instances - current_instances)
+        attrs['free_instances'] = max(0, free_instances)
+
+        license_date = int(attrs.get('license_date', 0) or 0)
|
||||||
|
current_date = int(time.time())
|
||||||
|
time_remaining = license_date - current_date
|
||||||
|
attrs['time_remaining'] = time_remaining
|
||||||
|
if attrs.setdefault('trial', False):
|
||||||
|
attrs['grace_period_remaining'] = time_remaining
|
||||||
|
else:
|
||||||
|
attrs['grace_period_remaining'] = (license_date + 2592000) - current_date
|
||||||
|
attrs['compliant'] = bool(time_remaining > 0 and free_instances >= 0)
|
||||||
|
attrs['date_warning'] = bool(time_remaining < self.SUBSCRIPTION_TIMEOUT)
|
||||||
|
attrs['date_expired'] = bool(time_remaining <= 0)
|
||||||
|
return attrs
|
||||||
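The compliance math in `validate` above is compact enough to get wrong, so here is a standalone sketch of just that arithmetic (the function name and inputs are illustrative, not part of AWX): a non-trial license keeps a 30-day (2592000-second) grace period past its end date, and `compliant` requires both remaining time and non-negative instance headroom.

```python
def check_compliance(license_date, current_date, available_instances,
                     current_instances, trial=False):
    """Mirror the compliance arithmetic used by validate() above."""
    time_remaining = license_date - current_date
    if trial:
        # Trial licenses get no grace period beyond the end date.
        grace_period_remaining = time_remaining
    else:
        # 2592000 seconds == 30 days of grace after the end date.
        grace_period_remaining = (license_date + 2592000) - current_date
    free_instances = available_instances - current_instances
    return {
        'time_remaining': time_remaining,
        'grace_period_remaining': grace_period_remaining,
        'free_instances': max(0, free_instances),
        'compliant': bool(time_remaining > 0 and free_instances >= 0),
    }


# A subscription that expired an hour ago is out of compliance, but a
# non-trial subscription still has nearly 30 days of grace period left.
result = check_compliance(license_date=1_000_000, current_date=1_003_600,
                          available_instances=100, current_instances=40)
print(result['compliant'])               # False
print(result['grace_period_remaining'])  # 2588400
```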
awx/main/utils/profiling.py (new file, 151 lines)
@@ -0,0 +1,151 @@
import cProfile
import functools
import pstats
import os
import uuid
import datetime
import json
import sys


class AWXProfileBase:
    def __init__(self, name, dest):
        self.name = name
        self.dest = dest
        self.results = {}

    def generate_results(self):
        raise RuntimeError("define me")

    def output_results(self, fname=None):
        if not os.path.isdir(self.dest):
            os.makedirs(self.dest)

        if fname:
            fpath = os.path.join(self.dest, fname)
            with open(fpath, 'w') as f:
                f.write(json.dumps(self.results, indent=2))


class AWXTiming(AWXProfileBase):
    def __init__(self, name, dest='/var/log/tower/timing'):
        super().__init__(name, dest)

        self.time_start = None
        self.time_end = None

    def start(self):
        self.time_start = datetime.datetime.now()

    def stop(self):
        self.time_end = datetime.datetime.now()

        self.generate_results()
        self.output_results()

    def generate_results(self):
        diff = (self.time_end - self.time_start).total_seconds()
        self.results = {
            'name': self.name,
            'diff': f'{diff}-seconds',
        }

    def output_results(self):
        fname = f"{self.results['diff']}-{self.name}-{uuid.uuid4()}.time"
        super().output_results(fname)


def timing(name, *init_args, **init_kwargs):
    def decorator_profile(func):
        @functools.wraps(func)
        def wrapper_profile(*args, **kwargs):
            timing = AWXTiming(name, *init_args, **init_kwargs)
            timing.start()
            res = func(*args, **kwargs)
            timing.stop()
            return res
        return wrapper_profile
    return decorator_profile


class AWXProfiler(AWXProfileBase):
    def __init__(self, name, dest='/var/log/tower/profile', dot_enabled=True):
        '''
        Try to do as little as possible in init. Instead, do the init
        only when the profiling is started.
        '''
        super().__init__(name, dest)
        self.started = False
        self.dot_enabled = dot_enabled
        self.results = {
            'total_time_seconds': 0,
        }

    def generate_results(self):
        self.results['total_time_seconds'] = pstats.Stats(self.prof).total_tt

    def output_results(self):
        super().output_results()

        filename_base = '%.3fs-%s-%s-%s' % (self.results['total_time_seconds'], self.name, self.pid, uuid.uuid4())
        pstats_filepath = os.path.join(self.dest, f"{filename_base}.pstats")
        extra_data = ""

        if self.dot_enabled:
            try:
                from gprof2dot import main as generate_dot
            except ImportError:
                extra_data = 'Dot graph generation failed due to package "gprof2dot" being unavailable.'
            else:
                raw_filepath = os.path.join(self.dest, f"{filename_base}.raw")
                dot_filepath = os.path.join(self.dest, f"{filename_base}.dot")

                pstats.Stats(self.prof).dump_stats(raw_filepath)
                generate_dot([
                    '-n', '2.5', '-f', 'pstats', '-o',
                    dot_filepath,
                    raw_filepath
                ])
                os.remove(raw_filepath)

        with open(pstats_filepath, 'w') as f:
            print(f"{self.name}, {extra_data}", file=f)
            pstats.Stats(self.prof, stream=f).sort_stats('cumulative').print_stats()
        return pstats_filepath

    def start(self):
        self.prof = cProfile.Profile()
        self.pid = os.getpid()

        self.prof.enable()
        self.started = True

    def is_started(self):
        return self.started

    def stop(self):
        if self.started:
            self.prof.disable()

            self.generate_results()
            res = self.output_results()
            self.started = False
            return res
        else:
            print("AWXProfiler::stop() called without calling start() first", file=sys.stderr)
            return None


def profile(name, *init_args, **init_kwargs):
    def decorator_profile(func):
        @functools.wraps(func)
        def wrapper_profile(*args, **kwargs):
            prof = AWXProfiler(name, *init_args, **init_kwargs)
            prof.start()
            res = func(*args, **kwargs)
            prof.stop()
            return res
        return wrapper_profile
    return decorator_profile
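The `timing` and `profile` helpers above are the standard three-layer parameterized-decorator pattern: the outer function captures the configuration, the middle one receives the function to wrap, and the inner wrapper brackets each call with `start()`/`stop()` while passing the return value through. A self-contained sketch of the same shape, using an in-memory list instead of AWX's file-writing classes:

```python
import functools
import time

RECORDS = []  # stands in for the files AWXTiming writes


def timed(name):
    """Parameterized decorator: record (name, elapsed seconds) per call."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            res = func(*args, **kwargs)
            RECORDS.append((name, time.monotonic() - start))
            return res
        return wrapper
    return decorator


@timed('add')
def add(a, b):
    return a + b


print(add(2, 3))      # 5 -- the return value passes through unchanged
print(RECORDS[0][0])  # 'add'
print(add.__name__)   # 'add', preserved by functools.wraps
```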
@@ -159,23 +159,29 @@
   gather_facts: false
   connection: local
   name: Install content with ansible-galaxy command if necessary
+  vars:
+    yaml_exts:
+      - {ext: .yml}
+      - {ext: .yaml}
   tasks:

     - block:
-        - name: detect requirements.yml
+        - name: detect roles/requirements.(yml/yaml)
           stat:
-            path: '{{project_path|quote}}/roles/requirements.yml'
+            path: "{{project_path|quote}}/roles/requirements{{ item.ext }}"
+          with_items: "{{ yaml_exts }}"
           register: doesRequirementsExist

-        - name: fetch galaxy roles from requirements.yml
+        - name: fetch galaxy roles from requirements.(yml/yaml)
           command: >
-            ansible-galaxy role install -r roles/requirements.yml
+            ansible-galaxy role install -r {{ item.stat.path }}
             --roles-path {{projects_root}}/.__awx_cache/{{local_path}}/stage/requirements_roles
             {{ ' -' + 'v' * ansible_verbosity if ansible_verbosity else '' }}
           args:
             chdir: "{{project_path|quote}}"
           register: galaxy_result
-          when: doesRequirementsExist.stat.exists
+          with_items: "{{ doesRequirementsExist.results }}"
+          when: item.stat.exists
           changed_when: "'was installed successfully' in galaxy_result.stdout"
           environment:
             ANSIBLE_FORCE_COLOR: false
@@ -186,20 +192,22 @@
       - install_roles

     - block:
-        - name: detect collections/requirements.yml
+        - name: detect collections/requirements.(yml/yaml)
           stat:
-            path: '{{project_path|quote}}/collections/requirements.yml'
+            path: "{{project_path|quote}}/collections/requirements{{ item.ext }}"
+          with_items: "{{ yaml_exts }}"
           register: doesCollectionRequirementsExist

-        - name: fetch galaxy collections from collections/requirements.yml
+        - name: fetch galaxy collections from collections/requirements.(yml/yaml)
           command: >
-            ansible-galaxy collection install -r collections/requirements.yml
+            ansible-galaxy collection install -r {{ item.stat.path }}
             --collections-path {{projects_root}}/.__awx_cache/{{local_path}}/stage/requirements_collections
             {{ ' -' + 'v' * ansible_verbosity if ansible_verbosity else '' }}
           args:
             chdir: "{{project_path|quote}}"
           register: galaxy_collection_result
-          when: doesCollectionRequirementsExist.stat.exists
+          with_items: "{{ doesCollectionRequirementsExist.results }}"
+          when: item.stat.exists
           changed_when: "'Installing ' in galaxy_collection_result.stdout"
           environment:
             ANSIBLE_FORCE_COLOR: false
@@ -91,7 +91,6 @@ USE_L10N = True
 USE_TZ = True

 STATICFILES_DIRS = (
-    os.path.join(BASE_DIR, 'ui', 'static'),
     os.path.join(BASE_DIR, 'ui_next', 'build', 'static'),
     os.path.join(BASE_DIR, 'static'),
 )
@@ -249,8 +248,6 @@ TEMPLATES = [
                'django.template.context_processors.static',
                'django.template.context_processors.tz',
                'django.contrib.messages.context_processors.messages',
-               'awx.ui.context_processors.settings',
-               'awx.ui.context_processors.version',
                'social_django.context_processors.backends',
                'social_django.context_processors.login_redirect',
            ],
@@ -184,3 +184,6 @@ else:
     pass

 AWX_CALLBACK_PROFILE = True
+
+if 'sqlite3' not in DATABASES['default']['ENGINE']:  # noqa
+    DATABASES['default'].setdefault('OPTIONS', dict()).setdefault('application_name', f'{CLUSTER_HOST_ID}-{os.getpid()}-{" ".join(sys.argv)}'[:63])  # noqa
@@ -102,6 +102,7 @@ except IOError:
     else:
         raise

+# The below runs AFTER all of the custom settings are imported.
 CELERYBEAT_SCHEDULE.update({  # noqa
     'isolated_heartbeat': {
@@ -110,3 +111,5 @@ CELERYBEAT_SCHEDULE.update({  # noqa
         'options': {'expires': AWX_ISOLATED_PERIODIC_CHECK * 2},  # noqa
     }
 })
+
+DATABASES['default'].setdefault('OPTIONS', dict()).setdefault('application_name', f'{CLUSTER_HOST_ID}-{os.getpid()}-{" ".join(sys.argv)}'[:63])  # noqa
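Both settings additions set the PostgreSQL `application_name` connection option so each AWX process can be identified in `pg_stat_activity`. PostgreSQL truncates `application_name` (by default at 63 bytes, `NAMEDATALEN - 1`), which is why the string is sliced to 63 characters up front. A sketch of the same construction, with an illustrative host id:

```python
import os
import sys


def build_application_name(cluster_host_id):
    # Postgres would silently truncate application_name anyway;
    # slicing to 63 characters makes the limit explicit.
    return f'{cluster_host_id}-{os.getpid()}-{" ".join(sys.argv)}'[:63]


name = build_application_name('awx-node-1')
print(len(name) <= 63)  # True
```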
@@ -1,19 +0,0 @@
Gruntfile.js
karma.*.js
webpack.*.js
nightwatch.*.js

etc
coverage
grunt-tasks
node_modules
po
static
templates

client/src/**/*.js
client/assets/**/*.js
test/spec/**/*.js

!client/src/app.start.js
!client/src/vendor.js
@@ -1,72 +0,0 @@
const path = require('path');

module.exports = {
    root: true,
    extends: [
        'airbnb-base'
    ],
    plugins: [
        'import',
        'disable'
    ],
    settings: {
        'import/resolver': {
            webpack: {
                config: path.join(__dirname, 'build/webpack.development.js')
            }
        },
        'eslint-plugin-disable': {
            paths: {
                import: ['**/build/*.js']
            }
        }
    },
    env: {
        browser: true,
        node: true
    },
    globals: {
        angular: true,
        d3: true,
        $: true,
        _: true,
        codemirror: true,
        jsyaml: true,
        crypto: true
    },
    rules: {
        'arrow-parens': 'off',
        'comma-dangle': 'off',
        indent: ['error', 4, {
            SwitchCase: 1
        }],
        'max-len': ['error', {
            code: 100,
            ignoreStrings: true,
            ignoreTemplateLiterals: true,
        }],
        'no-continue': 'off',
        'no-debugger': 'off',
        'no-mixed-operators': 'off',
        'no-param-reassign': 'off',
        'no-plusplus': 'off',
        'no-underscore-dangle': 'off',
        'no-use-before-define': 'off',
        'no-multiple-empty-lines': ['error', { max: 1 }],
        'object-curly-newline': 'off',
        'space-before-function-paren': ['error', 'always'],
        'no-trailing-spaces': ['error'],
        'prefer-destructuring': ['error', {
            'VariableDeclarator': {
                'array': false,
                'object': true
            },
            'AssignmentExpression': {
                'array': false,
                'object': true
            }
        }, {
            'enforceForRenamedProperties': false
        }]
    }
};
@@ -1,49 +0,0 @@
{
    "browser": true,
    "node": true,
    "jquery": true,
    "esnext": true,
    "globalstrict": true,
    "curly": true,
    "immed": true,
    "latedef": "nofunc",
    "noarg": true,
    "nonew": true,
    "maxerr": 10000,
    "notypeof": true,
    "globals": {
        "$ENV": true,
        "require": true,
        "global": true,
        "beforeEach": false,
        "inject": false,
        "module": false,
        "angular": false,
        "alert": false,
        "$AnsibleConfig": true,
        "$basePath": true,
        "jsyaml": false,
        "_": false,
        "d3": false,
        "Donut3D": false,
        "nv": false,
        "it": false,
        "xit": false,
        "expect": false,
        "context": false,
        "describe": false,
        "moment": false,
        "spyOn": false,
        "jasmine": false,
        "dagre": false,
        "crypto": false
    },
    "strict": false,
    "quotmark": false,
    "trailing": true,
    "undef": true,
    "unused": true,
    "eqeqeq": true,
    "indent": 4,
    "newcap": false
}
@@ -1 +0,0 @@
progress=false
@@ -1,20 +0,0 @@
module.exports = function(grunt) {
    // Load grunt tasks & configurations automatically from dir grunt/
    require('load-grunt-tasks')(grunt);
    // display task timings
    require('time-grunt')(grunt);

    var options = {
        config: {
            src: './grunt-tasks/*.js'
        },
        pkg: grunt.file.readJSON('package.json')
    };

    var configs = require('load-grunt-configs')(grunt, options);

    // Project configuration.
    grunt.initConfig(configs);
    grunt.loadNpmTasks('grunt-newer');
    grunt.loadNpmTasks('grunt-angular-gettext');
};
awx/ui/README.md (deleted, 103 lines)
@@ -1,103 +0,0 @@
# AWX UI

## Requirements
- node.js 10.x LTS
- npm >=6.x
- bzip2, gcc-c++, git, make

## Development
The API development server will need to be running. See [CONTRIBUTING.md](../../CONTRIBUTING.md).

```shell
# Build ui for the devel environment - reachable at https://localhost:8043
make ui-devel

# Alternatively, start the ui development server. While running, the ui will be reachable
# at https://localhost:3000 and updated automatically when code changes.
make ui-docker

# When using docker machine, use this command to start the ui development server instead.
DOCKER_MACHINE_NAME=default make ui-docker-machine
```

## Development with an external server
If you normally run awx on an external host/server (in this example, `awx.local`),
you'll need to reconfigure the webpack proxy slightly for `make ui-docker` to
work:

```javascript
/awx/settings/development.py
+
+CSRF_TRUSTED_ORIGINS = ['awx.local:8043']

awx/ui/build/webpack.watch.js
-    host: '127.0.0.1',
+    host: '0.0.0.0',
+    disableHostCheck: true,

/awx/ui/package.json
@@ -7,7 +7,7 @@
  "config": {
    ...
+   "django_host": "awx.local"
  },
```

## Testing
```shell
# run linters
make jshint

# run unit tests
make ui-test-ci

# run e2e tests - see awx/ui/test/e2e for more information
npm --prefix awx/ui run e2e
```
**Note**: Unit tests are run on your host machine and not in the development containers.

## Adding dependencies
```shell
# add an exact development or build dependency
npm install --prefix awx/ui --save-dev --save-exact dev-package@1.2.3

# add an exact production dependency
npm install --prefix awx/ui --save --save-exact prod-package@1.23

# add the updated package.json and package-lock.json files to scm
git add awx/ui/package.json awx/ui/package-lock.json
```

## Removing dependencies
```shell
# remove a development or build dependency
npm uninstall --prefix awx/ui --save-dev dev-package

# remove a production dependency
npm uninstall --prefix awx/ui --save prod-package
```

## Building for Production
```shell
# built files are placed in awx/ui/static
make ui-release
```

## Internationalization
Application strings marked for translation are extracted and used to generate `.pot` files using the following command:
```shell
# extract strings and generate .pot files
make pot
```
To include the translations in the development environment, we compile them prior to building the ui:
```shell
# remove any prior ui builds
make clean-ui

# compile the .pot files to javascript files usable by the application
make languages

# build the ui with translations included
make ui-devel
```
**Note**: Python 3.6 is required to compile the `.pot` files.
@@ -2,3 +2,4 @@
 # All Rights Reserved.

 default_app_config = 'awx.ui.apps.UIConfig'
+
@@ -7,3 +7,4 @@ class UIConfig(AppConfig):

     name = 'awx.ui'
     verbose_name = _('UI')
+
@@ -1,235 +0,0 @@
const path = require('path');

const webpack = require('webpack');
const CleanWebpackPlugin = require('clean-webpack-plugin');
const CopyWebpackPlugin = require('copy-webpack-plugin');
const HtmlWebpackPlugin = require('html-webpack-plugin');
const ExtractTextPlugin = require('extract-text-webpack-plugin');

const CLIENT_PATH = path.resolve(__dirname, '../client');
const LIB_PATH = path.join(CLIENT_PATH, 'lib');
const UI_PATH = path.resolve(__dirname, '..');

const ASSETS_PATH = path.join(CLIENT_PATH, 'assets');
const COMPONENTS_PATH = path.join(LIB_PATH, 'components');
const COVERAGE_PATH = path.join(UI_PATH, 'coverage');
const FEATURES_PATH = path.join(CLIENT_PATH, 'features');
const LANGUAGES_PATH = path.join(CLIENT_PATH, 'languages');
const MODELS_PATH = path.join(LIB_PATH, 'models');
const NODE_MODULES_PATH = path.join(UI_PATH, 'node_modules');
const SERVICES_PATH = path.join(LIB_PATH, 'services');
const SRC_PATH = path.join(CLIENT_PATH, 'src');
const STATIC_PATH = path.join(UI_PATH, 'static');
const TEST_PATH = path.join(UI_PATH, 'test');
const THEME_PATH = path.join(LIB_PATH, 'theme');

const APP_ENTRY = path.join(SRC_PATH, 'app.js');
const VENDOR_ENTRY = path.join(SRC_PATH, 'vendor.js');
const INDEX_ENTRY = path.join(CLIENT_PATH, 'index.template.ejs');
const INDEX_OUTPUT = path.join(UI_PATH, 'templates/ui/index.html');
const INSTALL_RUNNING_ENTRY = path.join(CLIENT_PATH, 'installing.template.ejs');
const INSTALL_RUNNING_OUTPUT = path.join(UI_PATH, 'templates/ui/installing.html');
const THEME_ENTRY = path.join(LIB_PATH, 'theme', 'index.less');
const OUTPUT = 'js/[name].[chunkhash].js';
const CHUNKS = ['vendor', 'app'];

const VENDOR = VENDOR_ENTRY;
const APP = [THEME_ENTRY, APP_ENTRY];

const base = {
    entry: {
        vendor: VENDOR,
        app: APP
    },
    output: {
        path: STATIC_PATH,
        publicPath: '',
        filename: OUTPUT
    },
    stats: {
        children: false,
        modules: false,
        chunks: false,
        excludeAssets: name => {
            const chunkNames = `(${CHUNKS.join('|')})`;
            const outputPattern = new RegExp(`${chunkNames}.[a-f0-9]+.(js|css)(|.map)$`, 'i');

            return !outputPattern.test(name);
        }
    },
    module: {
        rules: [
            {
                test: /\.js$/,
                use: {
                    loader: 'istanbul-instrumenter-loader',
                    options: { esModules: true }
                },
                enforce: 'pre',
                include: [
                    /src\/network-ui\//
                ]
            },
            {
                test: /\.js$/,
                loader: 'babel-loader',
                exclude: /node_modules/,
                options: {
                    presets: [
                        ['env', {
                            targets: {
                                browsers: ['last 2 versions']
                            }
                        }]
                    ]
                }
            },
            {
                test: /\.css$/,
                use: ExtractTextPlugin.extract({
                    use: {
                        loader: 'css-loader',
                        options: {
                            url: false
                        }
                    }
                })
            },
            {
                test: /lib\/theme\/index.less$/,
                use: ExtractTextPlugin.extract({
                    use: ['css-loader', 'less-loader']
                })
            },
            {
                test: /\.html$/,
                use: ['ngtemplate-loader', 'html-loader'],
                include: [
                    /lib\/components\//,
                    /features\//,
                    /src\//
                ]
            },
            {
                test: /\.svg$/,
                use: ['ngtemplate-loader', 'html-loader'],
                include: [
                    /lib\/components\//,
                    /features\//,
                    /src\//
                ]
            },
            {
                test: /\.json$/,
                loader: 'json-loader',
                exclude: /node_modules/
            }
        ]
    },
    plugins: [
        new webpack.ProvidePlugin({
            jsyaml: 'js-yaml',
            CodeMirror: 'codemirror',
            jsonlint: 'codemirror.jsonlint'
        }),
        new ExtractTextPlugin('css/[name].[chunkhash].css'),
        new CleanWebpackPlugin([STATIC_PATH, COVERAGE_PATH], {
            root: UI_PATH,
            verbose: false
        }),
        new CopyWebpackPlugin([
            {
                from: path.join(ASSETS_PATH, 'fontcustom/**/*'),
                to: path.join(STATIC_PATH, 'fonts/'),
                flatten: true
            },
            {
                from: path.join(NODE_MODULES_PATH, 'components-font-awesome/fonts/*'),
                to: path.join(STATIC_PATH, 'fonts/'),
                flatten: true
            },
            {
                from: path.join(ASSETS_PATH, 'custom-theme/images.new/*'),
                to: path.join(STATIC_PATH, 'images/'),
                flatten: true
            },
            {
                from: path.join(LANGUAGES_PATH, '*'),
                to: path.join(STATIC_PATH, 'languages'),
                flatten: true
            },
            {
                from: ASSETS_PATH,
                to: path.join(STATIC_PATH, 'assets')
            },
            {
                from: path.join(NODE_MODULES_PATH, 'angular-scheduler/lib/*.html'),
                to: path.join(STATIC_PATH, 'lib'),
                context: NODE_MODULES_PATH
            },
            {
                from: path.join(NODE_MODULES_PATH, 'angular-tz-extensions/tz/data/*'),
                to: path.join(STATIC_PATH, 'lib/'),
                context: NODE_MODULES_PATH
            },
            {
                from: path.join(SRC_PATH, '**/*.partial.html'),
                to: path.join(STATIC_PATH, 'partials/'),
                context: SRC_PATH
            },
            {
                from: path.join(SRC_PATH, 'partials', '*.html'),
                to: STATIC_PATH,
                context: SRC_PATH
            },
            {
                from: path.join(SRC_PATH, '*config.js'),
                to: STATIC_PATH,
                flatten: true
            }
        ]),
        new HtmlWebpackPlugin({
            alwaysWriteToDisk: true,
            template: INDEX_ENTRY,
            filename: INDEX_OUTPUT,
            inject: false,
            chunks: CHUNKS,
            chunksSortMode: chunk => (chunk.names[0] === 'vendor' ? -1 : 1)
        }),
        new HtmlWebpackPlugin({
            alwaysWriteToDisk: true,
            template: INSTALL_RUNNING_ENTRY,
            filename: INSTALL_RUNNING_OUTPUT,
            inject: false,
            chunks: CHUNKS,
            chunksSortMode: chunk => (chunk.names[0] === 'vendor' ? -1 : 1)
        }),
    ],
    resolve: {
        alias: {
            '~assets': ASSETS_PATH,
            '~components': COMPONENTS_PATH,
            '~features': FEATURES_PATH,
            '~models': MODELS_PATH,
            '~node_modules': NODE_MODULES_PATH,
            '~services': SERVICES_PATH,
            '~src': SRC_PATH,
            '~test': TEST_PATH,
            '~theme': THEME_PATH,
            '~ui': UI_PATH,
            d3$: '~node_modules/d3/d3.min.js',
            'codemirror.jsonlint$': '~node_modules/codemirror/addon/lint/json-lint.js',
            jquery: '~node_modules/jquery/dist/jquery.js',
            'jquery-resize$': '~node_modules/javascript-detect-element-resize/jquery.resize.js',
            select2$: '~node_modules/select2/dist/js/select2.full.min.js',
            'js-yaml$': '~node_modules/js-yaml/dist/js-yaml.min.js',
            'lr-infinite-scroll$': '~node_modules/lr-infinite-scroll/lrInfiniteScroll.js',
            'angular-tz-extensions$': '~node_modules/angular-tz-extensions/lib/angular-tz-extensions.js',
            'ng-toast-provider$': '~node_modules/ng-toast/src/scripts/provider.js',
            'ng-toast-directives$': '~node_modules/ng-toast/src/scripts/directives.js',
            'ng-toast$': '~node_modules/ng-toast/src/scripts/module.js'
        }
    }
};

module.exports = base;
@@ -1,9 +0,0 @@
const merge = require('webpack-merge');

const base = require('./webpack.base');

const development = {
    devtool: 'source-map'
};

module.exports = merge(base, development);
@@ -1,28 +0,0 @@
const path = require('path');

const merge = require('webpack-merge');
const webpack = require('webpack');
const UglifyJSPlugin = require('uglifyjs-webpack-plugin');
const HtmlWebpackPlugin = require('html-webpack-plugin');

const base = require('./webpack.base');

const CLIENT_PATH = path.resolve(__dirname, '../client');
const UI_PATH = path.resolve(__dirname, '..');
const CHUNKS = ['vendor', 'app'];

const production = {
    plugins: [
        new UglifyJSPlugin({
            compress: true,
            mangle: false
        }),
        new webpack.DefinePlugin({
            'process.env': {
                NODE_ENV: JSON.stringify('production')
            }
        })
    ]
};

module.exports = merge(base, production);
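The production file above layers its overrides onto `webpack.base` with `webpack-merge`, which by default overwrites scalar keys and appends array keys such as `plugins`. A minimal plain-Node sketch of that layering behavior (an illustration only, not the real `webpack-merge` API):

```javascript
// Toy merge: scalars from `overrides` win; arrays are concatenated,
// mirroring how webpack-merge appends the production plugins to the
// base config's plugin list.
function mergeConfigs(base, overrides) {
    const merged = { ...base };
    for (const [key, value] of Object.entries(overrides)) {
        if (Array.isArray(base[key]) && Array.isArray(value)) {
            merged[key] = base[key].concat(value);
        } else {
            merged[key] = value;
        }
    }
    return merged;
}

// Hypothetical stand-in configs (plugin names as strings for brevity).
const baseConfig = { devtool: 'source-map', plugins: ['HtmlWebpackPlugin'] };
const productionConfig = { plugins: ['UglifyJSPlugin', 'DefinePlugin'] };
const result = mergeConfigs(baseConfig, productionConfig);
// result.plugins → ['HtmlWebpackPlugin', 'UglifyJSPlugin', 'DefinePlugin']
```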
@@ -1,20 +0,0 @@
const _ = require('lodash');
const webpack = require('webpack');

const STATIC_URL = '/static/';

const development = require('./webpack.base');

const test = {
    devtool: 'cheap-source-map',
    plugins: [
        new webpack.DefinePlugin({
            $basePath: STATIC_URL
        })
    ]
};

test.plugins = development.plugins.concat(test.plugins);

module.exports = _.merge(development, test);
@@ -1,84 +0,0 @@
const path = require('path');

const _ = require('lodash');
const webpack = require('webpack');
const merge = require('webpack-merge');
const nodeObjectHash = require('node-object-hash');
const HardSourceWebpackPlugin = require('hard-source-webpack-plugin');
const HtmlWebpackHarddiskPlugin = require('html-webpack-harddisk-plugin');

const TARGET_PORT = _.get(process.env, 'npm_package_config_django_port', 8043);
const TARGET_HOST = _.get(process.env, 'npm_package_config_django_host', 'https://localhost');
const TARGET = `https://${TARGET_HOST}:${TARGET_PORT}`;
const OUTPUT = 'js/[name].js';

const development = require('./webpack.development');

const watch = {
    cache: true,
    devtool: 'cheap-source-map',
    output: {
        filename: OUTPUT
    },
    module: {
        rules: [
            {
                test: /\.js$/,
                enforce: 'pre',
                exclude: /node_modules/,
                loader: 'eslint-loader'
            }
        ]
    },
    plugins: [
        new HtmlWebpackHarddiskPlugin(),
        new HardSourceWebpackPlugin({
            cacheDirectory: 'node_modules/.cache/hard-source/[confighash]',
            recordsPath: 'node_modules/.cache/hard-source/[confighash]/records.json',
            configHash: config => nodeObjectHash({ sort: false }).hash(config),
            environmentHash: {
                root: process.cwd(),
                directories: ['node_modules'],
                files: ['package.json']
            }
        }),
        new webpack.HotModuleReplacementPlugin()
    ],
    devServer: {
        hot: true,
        inline: true,
        contentBase: path.resolve(__dirname, '..', 'static'),
        stats: 'minimal',
        publicPath: '/static/',
        host: '127.0.0.1',
        https: true,
        port: 3000,
        clientLogLevel: 'none',
        proxy: [{
            context: (pathname, req) => !(pathname === '/api/login/' && req.method === 'POST'),
            target: TARGET,
            secure: false,
            ws: false,
            bypass: req => req.originalUrl.includes('hot-update.json')
        },
        {
            context: '/api/login/',
            target: TARGET,
            secure: false,
            ws: false,
            headers: {
                Host: `localhost:${TARGET_PORT}`,
                Origin: TARGET,
                Referer: `${TARGET}/`
            }
        },
        {
            context: '/websocket',
            target: TARGET,
            secure: false,
            ws: true
        }]
    }
};

module.exports = merge(development, watch);
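The watch config's dev-server proxy routes almost everything to the Django backend, carving out `POST /api/login/` for a dedicated rule (so its `Host`/`Origin`/`Referer` headers can be rewritten) and skipping hot-update manifests, which the dev server serves itself. A self-contained sketch of those two routing predicates as they appear in the config:

```javascript
// Catch-all rule predicate: proxy every request EXCEPT the login POST,
// which the second proxy rule handles with rewritten headers.
const shouldProxy = (pathname, req) =>
    !(pathname === '/api/login/' && req.method === 'POST');

// Bypass predicate: hot-update manifests come from webpack-dev-server
// itself, so requests for them never hit the backend.
const bypassHotUpdate = req => req.originalUrl.includes('hot-update.json');
```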
@@ -1,5 +0,0 @@
REVISION=588429
CHROMIUM_URL="https://storage.googleapis.com/chromium-browser-snapshots/Linux_x64/${REVISION}/chrome-linux.zip"

wget ${CHROMIUM_URL} -w 30 -t 6 -O /tmp/chrome-linux.zip
unzip -o -d /tmp /tmp/chrome-linux.zip
@@ -1,202 +0,0 @@

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!) The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
(15 binary image assets deleted; sizes 206 B – 3.7 KiB)