Mirror of https://github.com/ansible/awx.git (synced 2026-02-08 21:14:47 -03:30)
Compare commits
193 Commits
Commit SHA1s:
8a4c85e473, 09d883f94a, 9ef57ec510, 5be006f9d3, 089bafa5d4, fa278f83ad, 0d68ca8f14, 713079bd70,
d3b137fbc4, 5246c842b2, 1dca4c9098, 8cb32045f0, 4962b729de, ed39a127e7, c4b4a4c21a, bd81fda05c,
83550eeba0, 4540cb653e, 69597c5654, fa61aef194, e35f6b2acb, a8140e86d7, 4d4ae84e32, ae349addfe,
31fdd5e85c, e4bde24f38, 9c019e1cc0, b3d298269b, 21f7ca21e0, 43bf370f8c, 6057921e34, d645d0894a,
4575cae458, 6982a8aee7, fa1091d089, 5095816762, c605705b39, ccc2a616c1, 51184ba20d, db33c0e4fa,
e9728f2a78, 5cdf2f88da, 93e940adfc, 64776f97cf, fc080732d4, d02364a833, 176da040d9, f2b4d87152,
17798edbc4, c1da74cbc0, 5e6ee4a371, 288fea8960, dca9daf719, 634504c7a1, c019d873b9, e4a21b67c7,
2e6c484a50, f8b64f2222, 6060b62acd, 0dcf6a2b1f, 452c1b53f7, 42d2f72683, 57e8ba7f3c, c882cda586,
784d18705c, 36996584f9, 0160dbe8bc, 28994d4b0b, 9b09344bae, 84ba383199, 6dcd87afec, 243ab58902,
6c877a15e3, 2ccf0a0004, c69db02762, 59e1c6d492, 35c27c8b16, 91edac0d84, ae1bd9d1e9, cf168b27d2,
8cb7b388dc, 171f0d6340, aff31ac02f, a23754897e, 3094b67664, 98d3f3dc8a, 6f2a07a7df, 54ac1905b3,
1bdae2d1f7, 2bc2e26cc7, 5010602e6b, c103a813bf, e097bc61c8, 2ea63eeca0, 52336c0fe8, 220354241b,
1ae8fdc15c, 4bbdce3478, d25e6249fd, 71d7bac261, acba5306c6, fca9245536, 47031da65b, b024d91c66,
da7002cf0c, f4f1762805, ad5857e06b, 12d735ec8f, 1e9173e8ef, 4809c40f3c, 4e9ec271c5, 6cd6a42e20,
f234c0f771, 3f49d2c455, a0fb9bef3a, ccaaee61f0, 70269d9a0d, ab6322a8f7, 8bc6367e1e, b74bf9f266,
321aa3b01d, 7f1096f711, 2b6cfd7b3d, b2b33605cc, d06b0de74b, 6dfc714c75, cf5d3d55f0, e91d383165,
72d19b93a0, ff1c96b0e0, 6aaf906594, da7baced50, 2b10c0f3f2, 01788263e2, 8daceabd26, 712b07c136,
8fbfed5c55, c4a3c0aac1, 365f897059, 7b1158ee8e, d8814b7162, 9af3fa557b, e0d8d35090, 7e83ddc968,
bbbacd62ae, a6fd3d0c09, edf0d4bf85, 5ab09686c9, 4ed4d85b91, e066b688fc, 15111dd24a, 31a96d20ab,
9a70ac88c0, 2ec5dda1d8, dab80fb842, a6404bdd0d, ee5199f77a, 7f409c6487, 491e4c709e, 480c8516ab,
9eda4efb74, a517b15c26, 609528e8a3, e17ee4b58f, 3dc8a10e85, e893017e00, 4a1c121792, d39ad9d9ce,
07a5e17284, 583d1390d2, 638f8eae21, 1d7bd835e6, 4f90406e91, 53b4dd5dbf, 491f4824b0, 91721e09df,
2828d31141, d10e727b3c, f57cf03f4b, b319f47048, 432daa6139, 835c26f6cb, f1c2a95f0d, 58e84a40e5,
9c04e08b4d, bda1abab8d, 8356327c2b, cafac2338d, e5dfc62dce, 11edd43af3, 27d0111a27, 58367811a0,
0c0e172caf
.gitignore (vendored), 7 changes

@@ -135,9 +135,10 @@ use_dev_supervisor.txt
# Ansible module tests
awx_collection_test_venv/
awx_collection/*.tar.gz
awx_collection/galaxy.yml
/awx_collection_test_venv/
/awx_collection/*.tar.gz
/awx_collection/galaxy.yml
/sanity/

.idea/*
*.unison.tmp
@@ -120,6 +120,8 @@ If these variables are present then all deployments will use these hosted images

To complete a deployment to OpenShift, you will obviously need access to an OpenShift cluster. For demo and testing purposes, you can use [Minishift](https://github.com/minishift/minishift) to create a single node cluster running inside a virtual machine.

When using OpenShift for deploying AWX make sure you have correct privileges to add the security context 'privileged', otherwise the installation will fail. The privileged context is needed because of the use of [the bubblewrap tool](https://github.com/containers/bubblewrap) to add an additional layer of security when using containers.

You will also need to have the `oc` command in your PATH. The `install.yml` playbook will call out to `oc` when logging into, and creating objects on the cluster.

The default resource requests per-deployment requires:

@@ -456,6 +458,10 @@ Before starting the build process, review the [inventory](./installer/inventory)

> When using docker-compose, the `docker-compose.yml` file will be created there (default `/tmp/awxcompose`).

*custom_venv_dir*

> Adds the custom venv environments from the local host to be passed into the containers at install.

*ca_trust_dir*

> If you're using a non trusted CA, provide a path where the untrusted Certs are stored on your Host.
Makefile, 10 changes

@@ -196,7 +196,7 @@ requirements_awx_dev:

requirements: requirements_ansible requirements_awx

requirements_dev: requirements requirements_awx_dev requirements_ansible_dev
requirements_dev: requirements_awx requirements_ansible_py3 requirements_awx_dev requirements_ansible_dev

requirements_test: requirements

@@ -381,7 +381,6 @@ test:
prepare_collection_venv:
rm -rf $(COLLECTION_VENV)
mkdir $(COLLECTION_VENV)
ln -s /usr/lib/python2.7/site-packages/ansible $(COLLECTION_VENV)/ansible
$(VENV_BASE)/awx/bin/pip install --target=$(COLLECTION_VENV) git+https://github.com/ansible/tower-cli.git

COLLECTION_TEST_DIRS ?= awx_collection/test/awx

@@ -399,6 +398,13 @@ flake8_collection:

test_collection_all: prepare_collection_venv test_collection flake8_collection

test_collection_sanity:
rm -rf sanity
mkdir -p sanity/ansible_collections/awx
cp -Ra awx_collection sanity/ansible_collections/awx/awx # symlinks do not work
cd sanity/ansible_collections/awx/awx && git init && git add . # requires both this file structure and a git repo, so there you go
cd sanity/ansible_collections/awx/awx && ansible-test sanity --test validate-modules

build_collection:
ansible-playbook -i localhost, awx_collection/template_galaxy.yml -e collection_package=$(COLLECTION_PACKAGE) -e collection_namespace=$(COLLECTION_NAMESPACE) -e collection_version=$(VERSION)
ansible-galaxy collection build awx_collection --output-path=awx_collection
@@ -574,7 +574,7 @@ class SubListCreateAPIView(SubListAPIView, ListCreateAPIView):
status=status.HTTP_400_BAD_REQUEST)

# Verify we have permission to add the object as given.
if not request.user.can_access(self.model, 'add', serializer.initial_data):
if not request.user.can_access(self.model, 'add', serializer.validated_data):
raise PermissionDenied()

# save the object through the serializer, reload and returned the saved
@@ -158,9 +158,16 @@ class Metadata(metadata.SimpleMetadata):
isinstance(field, JSONField) or
isinstance(model_field, JSONField) or
isinstance(field, DRFJSONField) or
isinstance(getattr(field, 'model_field', None), JSONField)
isinstance(getattr(field, 'model_field', None), JSONField) or
field.field_name == 'credential_passwords'
):
field_info['type'] = 'json'
elif (
isinstance(field, ManyRelatedField) and
field.field_name == 'credentials'
# launch-time credentials
):
field_info['type'] = 'list_of_ids'
elif isinstance(model_field, BooleanField):
field_info['type'] = 'boolean'
@@ -4338,13 +4338,30 @@ class NotificationTemplateSerializer(BaseSerializer):
error_list = []
collected_messages = []

def check_messages(messages):
for message_type in messages:
if message_type not in ('message', 'body'):
error_list.append(_("Message type '{}' invalid, must be either 'message' or 'body'").format(message_type))
continue
message = messages[message_type]
if message is None:
continue
if not isinstance(message, str):
error_list.append(_("Expected string for '{}', found {}, ").format(message_type, type(message)))
continue
if message_type == 'message':
if '\n' in message:
error_list.append(_("Messages cannot contain newlines (found newline in {} event)".format(event)))
continue
collected_messages.append(message)

# Validate structure / content types
if not isinstance(messages, dict):
error_list.append(_("Expected dict for 'messages' field, found {}".format(type(messages))))
else:
for event in messages:
if event not in ['started', 'success', 'error']:
error_list.append(_("Event '{}' invalid, must be one of 'started', 'success', or 'error'").format(event))
if event not in ('started', 'success', 'error', 'workflow_approval'):
error_list.append(_("Event '{}' invalid, must be one of 'started', 'success', 'error', or 'workflow_approval'").format(event))
continue
event_messages = messages[event]
if event_messages is None:

@@ -4352,21 +4369,21 @@ class NotificationTemplateSerializer(BaseSerializer):
if not isinstance(event_messages, dict):
error_list.append(_("Expected dict for event '{}', found {}").format(event, type(event_messages)))
continue
for message_type in event_messages:
if message_type not in ['message', 'body']:
error_list.append(_("Message type '{}' invalid, must be either 'message' or 'body'").format(message_type))
continue
message = event_messages[message_type]
if message is None:
continue
if not isinstance(message, str):
error_list.append(_("Expected string for '{}', found {}, ").format(message_type, type(message)))
continue
if message_type == 'message':
if '\n' in message:
error_list.append(_("Messages cannot contain newlines (found newline in {} event)".format(event)))
if event == 'workflow_approval':
for subevent in event_messages:
if subevent not in ('running', 'approved', 'timed_out', 'denied'):
error_list.append(_("Workflow Approval event '{}' invalid, must be one of "
"'running', 'approved', 'timed_out', or 'denied'").format(subevent))
continue
collected_messages.append(message)
subevent_messages = event_messages[subevent]
if subevent_messages is None:
continue
if not isinstance(subevent_messages, dict):
error_list.append(_("Expected dict for workflow approval event '{}', found {}").format(subevent, type(subevent_messages)))
continue
check_messages(subevent_messages)
else:
check_messages(event_messages)

# Subclass to return name of undefined field
class DescriptiveUndefined(StrictUndefined):
@@ -4497,8 +4514,18 @@ class NotificationSerializer(BaseSerializer):
'notification_type', 'recipients', 'subject', 'body')

def get_body(self, obj):
if obj.notification_type == 'webhook' and 'body' in obj.body:
return obj.body['body']
if obj.notification_type in ('webhook', 'pagerduty'):
if isinstance(obj.body, dict):
if 'body' in obj.body:
return obj.body['body']
elif isinstance(obj.body, str):
# attempt to load json string
try:
potential_body = json.loads(obj.body)
if isinstance(potential_body, dict):
return potential_body
except json.JSONDecodeError:
pass
return obj.body

def get_related(self, obj):
@@ -4774,6 +4801,18 @@ class InstanceGroupSerializer(BaseSerializer):
raise serializers.ValidationError(_('Isolated instances may not be added or removed from instances groups via the API.'))
if self.instance and self.instance.controller_id is not None:
raise serializers.ValidationError(_('Isolated instance group membership may not be managed via the API.'))
if value and self.instance and self.instance.is_containerized:
raise serializers.ValidationError(_('Containerized instances may not be managed via the API'))
return value

def validate_policy_instance_percentage(self, value):
if value and self.instance and self.instance.is_containerized:
raise serializers.ValidationError(_('Containerized instances may not be managed via the API'))
return value

def validate_policy_instance_minimum(self, value):
if value and self.instance and self.instance.is_containerized:
raise serializers.ValidationError(_('Containerized instances may not be managed via the API'))
return value

def validate_name(self, value):
@@ -102,7 +102,7 @@ from awx.main.scheduler.dag_workflow import WorkflowDAG
from awx.api.views.mixin import (
ControlledByScmMixin, InstanceGroupMembershipMixin,
OrganizationCountsMixin, RelatedJobsPreventDeleteMixin,
UnifiedJobDeletionMixin,
UnifiedJobDeletionMixin, NoTruncateMixin,
)
from awx.api.views.organization import ( # noqa
OrganizationList,

@@ -383,6 +383,13 @@ class InstanceGroupDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAP
serializer_class = serializers.InstanceGroupSerializer
permission_classes = (InstanceGroupTowerPermission,)

def update_raw_data(self, data):
if self.get_object().is_containerized:
data.pop('policy_instance_percentage', None)
data.pop('policy_instance_minimum', None)
data.pop('policy_instance_list', None)
return super(InstanceGroupDetail, self).update_raw_data(data)

def destroy(self, request, *args, **kwargs):
instance = self.get_object()
if instance.controller is not None:

@@ -568,6 +575,7 @@ class TeamUsersList(BaseUsersList):
serializer_class = serializers.UserSerializer
parent_model = models.Team
relationship = 'member_role.members'
ordering = ('username',)

class TeamRolesList(SubListAttachDetachAPIView):

@@ -904,6 +912,7 @@ class UserList(ListCreateAPIView):
model = models.User
serializer_class = serializers.UserSerializer
permission_classes = (UserPermission,)
ordering = ('username',)

class UserMeList(ListAPIView):

@@ -911,6 +920,7 @@ class UserMeList(ListAPIView):
model = models.User
serializer_class = serializers.UserSerializer
name = _('Me')
ordering = ('username',)

def get_queryset(self):
return self.model.objects.filter(pk=self.request.user.pk)

@@ -1254,6 +1264,7 @@ class CredentialOwnerUsersList(SubListAPIView):
serializer_class = serializers.UserSerializer
parent_model = models.Credential
relationship = 'admin_role.members'
ordering = ('username',)

class CredentialOwnerTeamsList(SubListAPIView):

@@ -2136,12 +2147,21 @@ class InventorySourceHostsList(HostRelatedSearchMixin, SubListDestroyAPIView):
def perform_list_destroy(self, instance_list):
inv_source = self.get_parent_object()
with ignore_inventory_computed_fields():
# Activity stream doesn't record disassociation here anyway
# no signals-related reason to not bulk-delete
models.Host.groups.through.objects.filter(
host__inventory_sources=inv_source
).delete()
r = super(InventorySourceHostsList, self).perform_list_destroy(instance_list)
if not settings.ACTIVITY_STREAM_ENABLED_FOR_INVENTORY_SYNC:
from awx.main.signals import disable_activity_stream
with disable_activity_stream():
# job host summary deletion necessary to avoid deadlock
models.JobHostSummary.objects.filter(host__inventory_sources=inv_source).update(host=None)
models.Host.objects.filter(inventory_sources=inv_source).delete()
r = super(InventorySourceHostsList, self).perform_list_destroy([])
else:
# Advance delete of group-host memberships to prevent deadlock
# Activity stream doesn't record disassociation here anyway
# no signals-related reason to not bulk-delete
models.Host.groups.through.objects.filter(
host__inventory_sources=inv_source
).delete()
r = super(InventorySourceHostsList, self).perform_list_destroy(instance_list)
update_inventory_computed_fields.delay(inv_source.inventory_id, True)
return r

@@ -2157,11 +2177,18 @@ class InventorySourceGroupsList(SubListDestroyAPIView):
def perform_list_destroy(self, instance_list):
inv_source = self.get_parent_object()
with ignore_inventory_computed_fields():
# Same arguments for bulk delete as with host list
models.Group.hosts.through.objects.filter(
group__inventory_sources=inv_source
).delete()
r = super(InventorySourceGroupsList, self).perform_list_destroy(instance_list)
if not settings.ACTIVITY_STREAM_ENABLED_FOR_INVENTORY_SYNC:
from awx.main.signals import disable_activity_stream
with disable_activity_stream():
models.Group.objects.filter(inventory_sources=inv_source).delete()
r = super(InventorySourceGroupsList, self).perform_list_destroy([])
else:
# Advance delete of group-host memberships to prevent deadlock
# Same arguments for bulk delete as with host list
models.Group.hosts.through.objects.filter(
group__inventory_sources=inv_source
).delete()
r = super(InventorySourceGroupsList, self).perform_list_destroy(instance_list)
update_inventory_computed_fields.delay(inv_source.inventory_id, True)
return r
@@ -3762,18 +3789,12 @@ class JobHostSummaryDetail(RetrieveAPIView):
serializer_class = serializers.JobHostSummarySerializer

class JobEventList(ListAPIView):
class JobEventList(NoTruncateMixin, ListAPIView):

model = models.JobEvent
serializer_class = serializers.JobEventSerializer
search_fields = ('stdout',)

def get_serializer_context(self):
context = super().get_serializer_context()
if self.request.query_params.get('no_truncate'):
context.update(no_truncate=True)
return context

class JobEventDetail(RetrieveAPIView):

@@ -3786,7 +3807,7 @@ class JobEventDetail(RetrieveAPIView):
return context

class JobEventChildrenList(SubListAPIView):
class JobEventChildrenList(NoTruncateMixin, SubListAPIView):

model = models.JobEvent
serializer_class = serializers.JobEventSerializer

@@ -3811,7 +3832,7 @@ class JobEventHostsList(HostRelatedSearchMixin, SubListAPIView):
name = _('Job Event Hosts List')

class BaseJobEventsList(SubListAPIView):
class BaseJobEventsList(NoTruncateMixin, SubListAPIView):

model = models.JobEvent
serializer_class = serializers.JobEventSerializer

@@ -4007,18 +4028,12 @@ class AdHocCommandRelaunch(GenericAPIView):
return Response(data, status=status.HTTP_201_CREATED, headers=headers)

class AdHocCommandEventList(ListAPIView):
class AdHocCommandEventList(NoTruncateMixin, ListAPIView):

model = models.AdHocCommandEvent
serializer_class = serializers.AdHocCommandEventSerializer
search_fields = ('stdout',)

def get_serializer_context(self):
context = super().get_serializer_context()
if self.request.query_params.get('no_truncate'):
context.update(no_truncate=True)
return context

class AdHocCommandEventDetail(RetrieveAPIView):

@@ -4031,7 +4046,7 @@ class AdHocCommandEventDetail(RetrieveAPIView):
return context

class BaseAdHocCommandEventsList(SubListAPIView):
class BaseAdHocCommandEventsList(NoTruncateMixin, SubListAPIView):

model = models.AdHocCommandEvent
serializer_class = serializers.AdHocCommandEventSerializer

@@ -4297,8 +4312,15 @@ class NotificationTemplateTest(GenericAPIView):

def post(self, request, *args, **kwargs):
obj = self.get_object()
notification = obj.generate_notification("Tower Notification Test {} {}".format(obj.id, settings.TOWER_URL_BASE),
{"body": "Ansible Tower Test Notification {} {}".format(obj.id, settings.TOWER_URL_BASE)})
msg = "Tower Notification Test {} {}".format(obj.id, settings.TOWER_URL_BASE)
if obj.notification_type in ('email', 'pagerduty'):
body = "Ansible Tower Test Notification {} {}".format(obj.id, settings.TOWER_URL_BASE)
elif obj.notification_type == 'webhook':
body = '{{"body": "Ansible Tower Test Notification {} {}"}}'.format(obj.id, settings.TOWER_URL_BASE)
else:
body = {"body": "Ansible Tower Test Notification {} {}".format(obj.id, settings.TOWER_URL_BASE)}
notification = obj.generate_notification(msg, body)

if not notification:
return Response({}, status=status.HTTP_400_BAD_REQUEST)
else:
@@ -270,3 +270,11 @@ class ControlledByScmMixin(object):
obj = super(ControlledByScmMixin, self).get_parent_object()
self._reset_inv_src_rev(obj)
return obj

class NoTruncateMixin(object):
def get_serializer_context(self):
context = super().get_serializer_context()
if self.request.query_params.get('no_truncate'):
context.update(no_truncate=True)
return context
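The new `NoTruncateMixin` centralizes the `?no_truncate` handling that the event list views above previously duplicated in their own `get_serializer_context` overrides. A minimal sketch of how a view picks it up, assuming a standalone DRF view for illustration (the mixin body is copied from the diff; `ExampleEventList` and its attributes are hypothetical stand-ins for views such as `JobEventList`):

```python
from rest_framework.generics import ListAPIView


class NoTruncateMixin(object):
    # Copied from the awx.api.views.mixin diff above: pass no_truncate=True
    # into the serializer context when the client sends ?no_truncate=1.
    def get_serializer_context(self):
        context = super().get_serializer_context()
        if self.request.query_params.get('no_truncate'):
            context.update(no_truncate=True)
        return context


class ExampleEventList(NoTruncateMixin, ListAPIView):
    # Hypothetical view: a real view would also set model/queryset and
    # serializer_class, as JobEventList and AdHocCommandEventList do above.
    search_fields = ('stdout',)
```

With the mixin first in the MRO, its `get_serializer_context` wraps the one from `ListAPIView`, so each list view only declares its model-specific attributes.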
@@ -1,6 +1,5 @@
from hashlib import sha1
import hmac
import json
import logging
import urllib.parse

@@ -151,13 +150,13 @@ class WebhookReceiverBase(APIView):
'webhook_credential': obj.webhook_credential,
'webhook_guid': event_guid,
},
'extra_vars': json.dumps({
'extra_vars': {
'tower_webhook_event_type': event_type,
'tower_webhook_event_guid': event_guid,
'tower_webhook_event_ref': event_ref,
'tower_webhook_status_api': status_api,
'tower_webhook_payload': request.data,
})
}
}

new_job = obj.create_unified_job(**kwargs)
@@ -1,11 +1,12 @@
# Python
import os
import re
import logging
import urllib.parse as urlparse
from collections import OrderedDict

# Django
from django.core.validators import URLValidator
from django.core.validators import URLValidator, _lazy_re_compile
from django.utils.translation import ugettext_lazy as _

# Django REST Framework

@@ -118,17 +119,42 @@ class StringListPathField(StringListField):

class URLField(CharField):
# these lines set up a custom regex that allow numbers in the
# top-level domain
tld_re = (
r'\.' # dot
r'(?!-)' # can't start with a dash
r'(?:[a-z' + URLValidator.ul + r'0-9' + '-]{2,63}' # domain label, this line was changed from the original URLValidator
r'|xn--[a-z0-9]{1,59})' # or punycode label
r'(?<!-)' # can't end with a dash
r'\.?' # may have a trailing dot
)

host_re = '(' + URLValidator.hostname_re + URLValidator.domain_re + tld_re + '|localhost)'

regex = _lazy_re_compile(
r'^(?:[a-z0-9\.\-\+]*)://' # scheme is validated separately
r'(?:[^\s:@/]+(?::[^\s:@/]*)?@)?' # user:pass authentication
r'(?:' + URLValidator.ipv4_re + '|' + URLValidator.ipv6_re + '|' + host_re + ')'
r'(?::\d{2,5})?' # port
r'(?:[/?#][^\s]*)?' # resource path
r'\Z', re.IGNORECASE)

def __init__(self, **kwargs):
schemes = kwargs.pop('schemes', None)
regex = kwargs.pop('regex', None)
self.allow_plain_hostname = kwargs.pop('allow_plain_hostname', False)
self.allow_numbers_in_top_level_domain = kwargs.pop('allow_numbers_in_top_level_domain', True)
super(URLField, self).__init__(**kwargs)
validator_kwargs = dict(message=_('Enter a valid URL'))
if schemes is not None:
validator_kwargs['schemes'] = schemes
if regex is not None:
validator_kwargs['regex'] = regex
if self.allow_numbers_in_top_level_domain and regex is None:
# default behavior is to allow numbers in the top level domain
# if a custom regex isn't provided
validator_kwargs['regex'] = URLField.regex
self.validators.append(URLValidator(**validator_kwargs))

def to_representation(self, value):
@@ -1,7 +1,7 @@
import pytest

from rest_framework.fields import ValidationError
from awx.conf.fields import StringListBooleanField, StringListPathField, ListTuplesField
from awx.conf.fields import StringListBooleanField, StringListPathField, ListTuplesField, URLField

class TestStringListBooleanField():

@@ -62,7 +62,7 @@ class TestListTuplesField():
FIELD_VALUES = [
([('a', 'b'), ('abc', '123')], [("a", "b"), ("abc", "123")]),
]

FIELD_VALUES_INVALID = [
("abc", type("abc")),
([('a', 'b', 'c'), ('abc', '123', '456')], type(('a',))),

@@ -130,3 +130,25 @@ class TestStringListPathField():
field.to_internal_value([value])
assert e.value.detail[0] == "{} is not a valid path choice.".format(value)

class TestURLField():
regex = "^https://www.example.org$"

@pytest.mark.parametrize("url,schemes,regex, allow_numbers_in_top_level_domain, expect_no_error",[
("ldap://www.example.org42", "ldap", None, True, True),
("https://www.example.org42", "https", None, False, False),
("https://www.example.org", None, regex, None, True),
("https://www.example3.org", None, regex, None, False),
("ftp://www.example.org", "https", None, None, False)
])
def test_urls(self, url, schemes, regex, allow_numbers_in_top_level_domain, expect_no_error):
kwargs = {}
kwargs.setdefault("allow_numbers_in_top_level_domain", allow_numbers_in_top_level_domain)
kwargs.setdefault("schemes", schemes)
kwargs.setdefault("regex", regex)
field = URLField(**kwargs)
if expect_no_error:
field.run_validators(url)
else:
with pytest.raises(ValidationError):
field.run_validators(url)
@@ -465,7 +465,7 @@ class BaseAccess(object):
else:
relationship = 'members'
return access_method(obj, parent_obj, relationship, skip_sub_obj_read_check=True, data={})
except (ParseError, ObjectDoesNotExist):
except (ParseError, ObjectDoesNotExist, PermissionDenied):
return False
return False

@@ -1660,26 +1660,19 @@ class JobAccess(BaseAccess):
except JobLaunchConfig.DoesNotExist:
config = None

if obj.job_template and (self.user not in obj.job_template.execute_role):
return False

# Check if JT execute access (and related prompts) is sufficient
if obj.job_template is not None:
if config is None:
prompts_access = False
elif not config.has_user_prompts(obj.job_template):
prompts_access = True
elif obj.created_by_id != self.user.pk and vars_are_encrypted(config.extra_data):
prompts_access = False
if self.save_messages:
self.messages['detail'] = _('Job was launched with secret prompts provided by another user.')
else:
prompts_access = (
JobLaunchConfigAccess(self.user).can_add({'reference_obj': config}) and
not config.has_unprompted(obj.job_template)
)
jt_access = self.user in obj.job_template.execute_role
if prompts_access and jt_access:
if config and obj.job_template:
if not config.has_user_prompts(obj.job_template):
return True
elif not jt_access:
return False
elif obj.created_by_id != self.user.pk and vars_are_encrypted(config.extra_data):
# never allowed, not even for org admins
raise PermissionDenied(_('Job was launched with secret prompts provided by another user.'))
elif not config.has_unprompted(obj.job_template):
if JobLaunchConfigAccess(self.user).can_add({'reference_obj': config}):
return True

org_access = bool(obj.inventory) and self.user in obj.inventory.organization.inventory_admin_role
project_access = obj.project is None or self.user in obj.project.admin_role

@@ -2098,23 +2091,20 @@ class WorkflowJobAccess(BaseAccess):
self.messages['detail'] = _('Workflow Job was launched with unknown prompts.')
return False

# execute permission to WFJT is mandatory for any relaunch
if self.user not in template.execute_role:
return False

# Check if access to prompts to prevent relaunch
if config.prompts_dict():
if obj.created_by_id != self.user.pk and vars_are_encrypted(config.extra_data):
if self.save_messages:
self.messages['detail'] = _('Job was launched with secret prompts provided by another user.')
return False
raise PermissionDenied(_("Job was launched with secret prompts provided by another user."))
if not JobLaunchConfigAccess(self.user).can_add({'reference_obj': config}):
if self.save_messages:
self.messages['detail'] = _('Job was launched with prompts you lack access to.')
return False
raise PermissionDenied(_('Job was launched with prompts you lack access to.'))
if config.has_unprompted(template):
if self.save_messages:
self.messages['detail'] = _('Job was launched with prompts no longer accepted.')
return False
raise PermissionDenied(_('Job was launched with prompts no longer accepted.'))

# execute permission to WFJT is mandatory for any relaunch
return (self.user in template.execute_role)
return True # passed config checks

def can_recreate(self, obj):
node_qs = obj.workflow_job_nodes.all().prefetch_related('inventory', 'credentials', 'unified_job_template')
@@ -54,15 +54,6 @@ register(
category_slug='system',
)

register(
'TOWER_ADMIN_ALERTS',
field_class=fields.BooleanField,
label=_('Enable Administrator Alerts'),
help_text=_('Email Admin users for system events that may require attention.'),
category=_('System'),
category_slug='system',
)

register(
'TOWER_URL_BASE',
field_class=fields.URLField,

@@ -513,6 +504,16 @@ register(
category_slug='jobs'
)

register(
'PUBLIC_GALAXY_ENABLED',
field_class=fields.BooleanField,
default=True,
label=_('Allow Access to Public Galaxy'),
help_text=_('Allow or deny access to the public Ansible Galaxy during project updates.'),
category=_('Jobs'),
category_slug='jobs'
)

register(
'STDOUT_MAX_BYTES_DISPLAY',
field_class=fields.IntegerField,
@@ -123,8 +123,16 @@ class PoolWorker(object):
# if any tasks were finished, removed them from the managed tasks for
# this worker
for uuid in finished:
self.messages_finished += 1
del self.managed_tasks[uuid]
try:
del self.managed_tasks[uuid]
self.messages_finished += 1
except KeyError:
# ansible _sometimes_ appears to send events w/ duplicate UUIDs;
# UUIDs for ansible events are *not* actually globally unique
# when this occurs, it's _fine_ to ignore this KeyError because
# the purpose of self.managed_tasks is to just track internal
# state of which events are *currently* being processed.
pass

@property
def current_task(self):
@@ -4,6 +4,7 @@ import importlib
import sys
import traceback

from kubernetes.config import kube_config

from awx.main.tasks import dispatch_startup, inform_cluster_of_shutdown

@@ -107,6 +108,14 @@ class TaskWorker(BaseWorker):
for callback in body.get('errbacks', []) or []:
callback['uuid'] = body['uuid']
self.perform_work(callback)
finally:
# It's frustrating that we have to do this, but the python k8s
# client leaves behind cacert files in /tmp, so we must clean up
# the tmpdir per-dispatcher process every time a new task comes in
try:
kube_config._cleanup_temp_files()
except Exception:
logger.exception('failed to cleanup k8s client tmp files')

for callback in body.get('callbacks', []) or []:
callback['uuid'] = body['uuid']
@@ -6,6 +6,7 @@ import stat
import tempfile
import time
import logging
import yaml

from django.conf import settings
import ansible_runner

@@ -48,10 +49,17 @@ class IsolatedManager(object):
def build_inventory(self, hosts):
if self.instance and self.instance.is_containerized:
inventory = {'all': {'hosts': {}}}
fd, path = tempfile.mkstemp(
prefix='.kubeconfig', dir=self.private_data_dir
)
with open(path, 'wb') as temp:
temp.write(yaml.dump(self.pod_manager.kube_config).encode())
temp.flush()
os.chmod(temp.name, stat.S_IRUSR | stat.S_IWUSR | stat.S_IXUSR)
for host in hosts:
inventory['all']['hosts'][host] = {
"ansible_connection": "kubectl",
"ansible_kubectl_config": self.pod_manager.kube_config
"ansible_kubectl_config": path,
}
else:
inventory = '\n'.join([

@@ -143,6 +151,8 @@ class IsolatedManager(object):
'- /artifacts/job_events/*-partial.json.tmp',
# don't rsync the ssh_key FIFO
'- /env/ssh_key',
# don't rsync kube config files
'- .kubeconfig*'
]

for filename, data in (
@@ -295,7 +295,10 @@ class PrimordialModel(HasEditsMixin, CreatedModifiedModel):

def __init__(self, *args, **kwargs):
r = super(PrimordialModel, self).__init__(*args, **kwargs)
self._prior_values_store = self._get_fields_snapshot()
if self.pk:
self._prior_values_store = self._get_fields_snapshot()
else:
self._prior_values_store = {}
return r

def save(self, *args, **kwargs):
@@ -86,6 +86,7 @@ class Credential(PasswordFieldsModel, CommonModelNameNotUnique, ResourceMixin):
unique_together = (('organization', 'name', 'credential_type'))

PASSWORD_FIELDS = ['inputs']
FIELDS_TO_PRESERVE_AT_COPY = ['input_sources']

credential_type = models.ForeignKey(
'CredentialType',

@@ -1162,6 +1163,8 @@ class CredentialInputSource(PrimordialModel):
unique_together = (('target_credential', 'input_field_name'),)
ordering = ('target_credential', 'source_credential', 'input_field_name',)

FIELDS_TO_PRESERVE_AT_COPY = ['source_credential', 'metadata', 'input_field_name']

target_credential = models.ForeignKey(
'Credential',
related_name='input_sources',
@@ -900,6 +900,9 @@ class LaunchTimeConfigBase(BaseModel):
data[prompt_name] = self.display_extra_vars()
else:
data[prompt_name] = self.extra_vars
# Depending on model, field type may save and return as string
if isinstance(data[prompt_name], str):
data[prompt_name] = parse_yaml_or_json(data[prompt_name])
if self.survey_passwords and not display:
data['survey_passwords'] = self.survey_passwords
else:
@@ -73,7 +73,7 @@ class NotificationTemplate(CommonModelNameNotUnique):
notification_configuration = prevent_search(JSONField(blank=False))

def default_messages():
return {'started': None, 'success': None, 'error': None}
return {'started': None, 'success': None, 'error': None, 'workflow_approval': None}

messages = JSONField(
null=True,

@@ -92,25 +92,6 @@ class NotificationTemplate(CommonModelNameNotUnique):
def get_message(self, condition):
return self.messages.get(condition, {})

def build_notification_message(self, event_type, context):
env = sandbox.ImmutableSandboxedEnvironment()
templates = self.get_message(event_type)
msg_template = templates.get('message', {})

try:
notification_subject = env.from_string(msg_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_subject = ''

msg_body = templates.get('body', {})
try:
notification_body = env.from_string(msg_body).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_body = ''

return (notification_subject, notification_body)

def get_absolute_url(self, request=None):
return reverse('api:notification_template_detail', kwargs={'pk': self.pk}, request=request)

@@ -128,19 +109,34 @@ class NotificationTemplate(CommonModelNameNotUnique):
old_messages = old_nt.messages
new_messages = self.messages

def merge_messages(local_old_messages, local_new_messages, local_event):
if local_new_messages.get(local_event, {}) and local_old_messages.get(local_event, {}):
local_old_event_msgs = local_old_messages[local_event]
local_new_event_msgs = local_new_messages[local_event]
for msg_type in ['message', 'body']:
if msg_type not in local_new_event_msgs and local_old_event_msgs.get(msg_type, None):
local_new_event_msgs[msg_type] = local_old_event_msgs[msg_type]
if old_messages is not None and new_messages is not None:
for event in ['started', 'success', 'error']:
for event in ('started', 'success', 'error', 'workflow_approval'):
if not new_messages.get(event, {}) and old_messages.get(event, {}):
new_messages[event] = old_messages[event]
continue
if new_messages.get(event, {}) and old_messages.get(event, {}):
old_event_msgs = old_messages[event]
new_event_msgs = new_messages[event]
for msg_type in ['message', 'body']:
if msg_type not in new_event_msgs and old_event_msgs.get(msg_type, None):
new_event_msgs[msg_type] = old_event_msgs[msg_type]

if event == 'workflow_approval' and old_messages.get('workflow_approval', None):
new_messages.setdefault('workflow_approval', {})
for subevent in ('running', 'approved', 'timed_out', 'denied'):
old_wfa_messages = old_messages['workflow_approval']
new_wfa_messages = new_messages['workflow_approval']
if not new_wfa_messages.get(subevent, {}) and old_wfa_messages.get(subevent, {}):
new_wfa_messages[subevent] = old_wfa_messages[subevent]
continue
if old_wfa_messages:
merge_messages(old_wfa_messages, new_wfa_messages, subevent)
else:
merge_messages(old_messages, new_messages, event)
new_messages.setdefault(event, None)

for field in filter(lambda x: self.notification_class.init_parameters[x]['type'] == "password",
self.notification_class.init_parameters):
if self.notification_configuration[field].startswith("$encrypted$"):

@@ -169,12 +165,12 @@ class NotificationTemplate(CommonModelNameNotUnique):
def recipients(self):
return self.notification_configuration[self.notification_class.recipient_parameter]

def generate_notification(self, subject, message):
def generate_notification(self, msg, body):
notification = Notification(notification_template=self,
notification_type=self.notification_type,
recipients=smart_str(self.recipients),
subject=subject,
body=message)
subject=msg,
body=body)
notification.save()
return notification
@@ -370,7 +366,7 @@ class JobNotificationMixin(object):
'verbosity': 0},
'job_friendly_name': 'Job',
'url': 'https://towerhost/#/jobs/playbook/1010',
'job_summary_dict': """{'url': 'https://towerhost/$/jobs/playbook/13',
'job_metadata': """{'url': 'https://towerhost/$/jobs/playbook/13',
'traceback': '',
'status': 'running',
'started': '2019-08-07T21:46:38.362630+00:00',

@@ -389,14 +385,14 @@ class JobNotificationMixin(object):
return context

def context(self, serialized_job):
"""Returns a context that can be used for rendering notification messages.
Context contains whitelisted content retrieved from a serialized job object
"""Returns a dictionary that can be used for rendering notification messages.
The context will contain whitelisted content retrieved from a serialized job object
(see JobNotificationMixin.JOB_FIELDS_WHITELIST), the job's friendly name,
and a url to the job run."""
context = {'job': {},
'job_friendly_name': self.get_notification_friendly_name(),
'url': self.get_ui_url(),
'job_summary_dict': json.dumps(self.notification_data(), indent=4)}
'job_metadata': json.dumps(self.notification_data(), indent=4)}

def build_context(node, fields, whitelisted_fields):
for safe_field in whitelisted_fields:

@@ -434,32 +430,33 @@ class JobNotificationMixin(object):
context = self.context(job_serialization)

msg_template = body_template = None
msg = body = ''

# Use custom template if available
if nt.messages:
templates = nt.messages.get(self.STATUS_TO_TEMPLATE_TYPE[status], {}) or {}
msg_template = templates.get('message', {})
body_template = templates.get('body', {})
template = nt.messages.get(self.STATUS_TO_TEMPLATE_TYPE[status], {}) or {}
msg_template = template.get('message', None)
body_template = template.get('body', None)
# If custom template not provided, look up default template
default_template = nt.notification_class.default_messages[self.STATUS_TO_TEMPLATE_TYPE[status]]
if not msg_template:
msg_template = default_template.get('message', None)
if not body_template:
body_template = default_template.get('body', None)

if msg_template:
try:
notification_subject = env.from_string(msg_template).render(**context)
msg = env.from_string(msg_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_subject = ''
else:
notification_subject = u"{} #{} '{}' {}: {}".format(self.get_notification_friendly_name(),
self.id,
self.name,
status,
self.get_ui_url())
notification_body = self.notification_data()
notification_body['friendly_name'] = self.get_notification_friendly_name()
msg = ''

if body_template:
try:
notification_body['body'] = env.from_string(body_template).render(**context)
body = env.from_string(body_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_body['body'] = ''
body = ''

return (notification_subject, notification_body)
return (msg, body)

def send_notification_templates(self, status):
from awx.main.tasks import send_notifications # avoid circular import

@@ -475,16 +472,13 @@ class JobNotificationMixin(object):
return

for nt in set(notification_templates.get(self.STATUS_TO_TEMPLATE_TYPE[status], [])):
try:
(notification_subject, notification_body) = self.build_notification_message(nt, status)
except AttributeError:
raise NotImplementedError("build_notification_message() does not exist" % status)
(msg, body) = self.build_notification_message(nt, status)

# Use kwargs to force late-binding
# https://stackoverflow.com/a/3431699/10669572
def send_it(local_nt=nt, local_subject=notification_subject, local_body=notification_body):
def send_it(local_nt=nt, local_msg=msg, local_body=body):
def _func():
send_notifications.delay([local_nt.generate_notification(local_subject, local_body).id],
send_notifications.delay([local_nt.generate_notification(local_msg, local_body).id],
job_id=self.id)
return _func
connection.on_commit(send_it())
@@ -2,6 +2,7 @@
# All Rights Reserved.

# Python
import json
import logging
from copy import copy
from urllib.parse import urljoin

@@ -16,6 +17,9 @@ from django.core.exceptions import ObjectDoesNotExist
# Django-CRUM
from crum import get_current_user

from jinja2 import sandbox
from jinja2.exceptions import TemplateSyntaxError, UndefinedError, SecurityError

# AWX
from awx.api.versioning import reverse
from awx.main.models import (prevent_search, accepts_json, UnifiedJobTemplate,

@@ -763,22 +767,45 @@ class WorkflowApproval(UnifiedJob, JobNotificationMixin):
connection.on_commit(send_it())

def build_approval_notification_message(self, nt, approval_status):
subject = []
workflow_url = urljoin(settings.TOWER_URL_BASE, '/#/workflows/{}'.format(self.workflow_job.id))
subject.append(('The approval node "{}"').format(self.workflow_approval_template.name))
if approval_status == 'running':
subject.append(('needs review. This node can be viewed at: {}').format(workflow_url))
if approval_status == 'approved':
subject.append(('was approved. {}').format(workflow_url))
if approval_status == 'timed_out':
subject.append(('has timed out. {}').format(workflow_url))
elif approval_status == 'denied':
subject.append(('was denied. {}').format(workflow_url))
subject = " ".join(subject)
body = self.notification_data()
body['body'] = subject
env = sandbox.ImmutableSandboxedEnvironment()

return subject, body
context = self.context(approval_status)

msg_template = body_template = None
msg = body = ''

# Use custom template if available
if nt.messages and nt.messages.get('workflow_approval', None):
template = nt.messages['workflow_approval'].get(approval_status, {})
msg_template = template.get('message', None)
body_template = template.get('body', None)
# If custom template not provided, look up default template
default_template = nt.notification_class.default_messages['workflow_approval'][approval_status]
if not msg_template:
msg_template = default_template.get('message', None)
if not body_template:
body_template = default_template.get('body', None)

if msg_template:
try:
msg = env.from_string(msg_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
msg = ''

if body_template:
try:
body = env.from_string(body_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
body = ''

return (msg, body)

def context(self, approval_status):
workflow_url = urljoin(settings.TOWER_URL_BASE, '/#/workflows/{}'.format(self.workflow_job.id))
return {'approval_status': approval_status,
'approval_node_name': self.workflow_approval_template.name,
'workflow_url': workflow_url,
'job_metadata': json.dumps(self.notification_data(), indent=4)}

@property
def workflow_job_template(self):
@@ -1,21 +1,10 @@
# Copyright (c) 2016 Ansible, Inc.
# All Rights Reserved.

import json

from django.utils.encoding import smart_text
from django.core.mail.backends.base import BaseEmailBackend
from django.utils.translation import ugettext_lazy as _

class AWXBaseEmailBackend(BaseEmailBackend):

def format_body(self, body):
if "body" in body:
body_actual = body['body']
else:
body_actual = smart_text(_("{} #{} had status {}, view details at {}\n\n").format(
body['friendly_name'], body['id'], body['status'], body['url'])
)
body_actual += json.dumps(body, indent=4)
return body_actual
return body
awx/main/notifications/custom_notification_base.py (normal file), 20 changes

@@ -0,0 +1,20 @@
# Copyright (c) 2019 Ansible, Inc.
# All Rights Reserved.

class CustomNotificationBase(object):
DEFAULT_MSG = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
DEFAULT_BODY = "{{ job_friendly_name }} #{{ job.id }} had status {{ job.status }}, view details at {{ url }}\n\n{{ job_metadata }}"

default_messages = {"started": {"message": DEFAULT_MSG, "body": None},
"success": {"message": DEFAULT_MSG, "body": None},
"error": {"message": DEFAULT_MSG, "body": None},
"workflow_approval": {"running": {"message": 'The approval node "{{ approval_node_name }}" needs review. '
'This node can be viewed at: {{ workflow_url }}',
"body": None},
"approved": {"message": 'The approval node "{{ approval_node_name }}" was approved. {{ workflow_url }}',
"body": None},
"timed_out": {"message": 'The approval node "{{ approval_node_name }}" has timed out. {{ workflow_url }}',
"body": None},
"denied": {"message": 'The approval node "{{ approval_node_name }}" was denied. {{ workflow_url }}',
"body": None}}}
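These `default_messages` are what `build_notification_message` and `build_approval_notification_message` (shown earlier in the models diffs) fall back to when a notification template has no custom message for an event: the default string is a Jinja2 template rendered in a sandboxed environment against the job or approval context. A minimal sketch of that fallback rendering, using made-up context values (the real context comes from `JobNotificationMixin.context()` / `WorkflowApproval.context()` above):

```python
from jinja2 import sandbox
from jinja2.exceptions import TemplateSyntaxError, UndefinedError, SecurityError

# Default template for the 'approved' workflow_approval sub-event, copied from
# CustomNotificationBase.default_messages above.
DEFAULT_APPROVED_MSG = (
    'The approval node "{{ approval_node_name }}" was approved. {{ workflow_url }}'
)

# Hypothetical context values, standing in for WorkflowApproval.context(approval_status).
context = {
    'approval_node_name': 'Deploy to production',
    'workflow_url': 'https://towerhost/#/workflows/42',
}

env = sandbox.ImmutableSandboxedEnvironment()
try:
    # Render the template; the sandbox blocks unsafe template constructs.
    msg = env.from_string(DEFAULT_APPROVED_MSG).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
    # Mirrors the defensive fallback used in the models code above.
    msg = ''

print(msg)
# The approval node "Deploy to production" was approved. https://towerhost/#/workflows/42
```

The same fallback applies to the started/success/error events, which is why the per-backend `DEFAULT_SUBJECT` copies removed in the diffs below are no longer needed.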
@@ -1,14 +1,15 @@
# Copyright (c) 2016 Ansible, Inc.
# All Rights Reserved.

import json

from django.utils.encoding import smart_text
from django.core.mail.backends.smtp import EmailBackend
from django.utils.translation import ugettext_lazy as _

from awx.main.notifications.custom_notification_base import CustomNotificationBase

DEFAULT_MSG = CustomNotificationBase.DEFAULT_MSG
DEFAULT_BODY = CustomNotificationBase.DEFAULT_BODY

class CustomEmailBackend(EmailBackend):
class CustomEmailBackend(EmailBackend, CustomNotificationBase):

init_parameters = {"host": {"label": "Host", "type": "string"},
"port": {"label": "Port", "type": "int"},

@@ -19,22 +20,17 @@ class CustomEmailBackend(EmailBackend):
"sender": {"label": "Sender Email", "type": "string"},
"recipients": {"label": "Recipient List", "type": "list"},
"timeout": {"label": "Timeout", "type": "int", "default": 30}}

DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
DEFAULT_BODY = smart_text(_("{{ job_friendly_name }} #{{ job.id }} had status {{ job.status }}, view details at {{ url }}\n\n{{ job_summary_dict }}"))
default_messages = {"started": {"message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
"success": {"message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
"error": {"message": DEFAULT_SUBJECT, "body": DEFAULT_BODY}}
recipient_parameter = "recipients"
sender_parameter = "sender"

default_messages = {"started": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"success": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"error": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"workflow_approval": {"running": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"approved": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"timed_out": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"denied": {"message": DEFAULT_MSG, "body": DEFAULT_BODY}}}

def format_body(self, body):
if "body" in body:
body_actual = body['body']
else:
body_actual = smart_text(_("{} #{} had status {}, view details at {}\n\n").format(
body['friendly_name'], body['id'], body['status'], body['url'])
)
body_actual += json.dumps(body, indent=4)
return body_actual
# leave body unchanged (expect a string)
return body
@@ -8,24 +8,21 @@ import dateutil.parser as dp

from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _

from awx.main.notifications.base import AWXBaseEmailBackend
from awx.main.notifications.custom_notification_base import CustomNotificationBase

logger = logging.getLogger('awx.main.notifications.grafana_backend')

class GrafanaBackend(AWXBaseEmailBackend):
class GrafanaBackend(AWXBaseEmailBackend, CustomNotificationBase):

init_parameters = {"grafana_url": {"label": "Grafana URL", "type": "string"},
"grafana_key": {"label": "Grafana API Key", "type": "password"}}
recipient_parameter = "grafana_url"
sender_parameter = None

DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}

def __init__(self, grafana_key,dashboardId=None, panelId=None, annotation_tags=None, grafana_no_verify_ssl=False, isRegion=True,
fail_silently=False, **kwargs):
super(GrafanaBackend, self).__init__(fail_silently=fail_silently)
@@ -7,12 +7,14 @@ import requests

from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _

from awx.main.notifications.base import AWXBaseEmailBackend
from awx.main.notifications.custom_notification_base import CustomNotificationBase

logger = logging.getLogger('awx.main.notifications.hipchat_backend')

class HipChatBackend(AWXBaseEmailBackend):
class HipChatBackend(AWXBaseEmailBackend, CustomNotificationBase):

init_parameters = {"token": {"label": "Token", "type": "password"},
"rooms": {"label": "Destination Rooms", "type": "list"},

@@ -23,11 +25,6 @@ class HipChatBackend(AWXBaseEmailBackend):
recipient_parameter = "rooms"
sender_parameter = "message_from"

DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}

def __init__(self, token, color, api_url, notify, fail_silently=False, **kwargs):
super(HipChatBackend, self).__init__(fail_silently=fail_silently)
self.token = token
@@ -9,12 +9,14 @@ import irc.client
|
||||
|
||||
from django.utils.encoding import smart_text
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
|
||||
from awx.main.notifications.base import AWXBaseEmailBackend
|
||||
from awx.main.notifications.custom_notification_base import CustomNotificationBase
|
||||
|
||||
logger = logging.getLogger('awx.main.notifications.irc_backend')
|
||||
|
||||
|
||||
class IrcBackend(AWXBaseEmailBackend):
|
||||
class IrcBackend(AWXBaseEmailBackend, CustomNotificationBase):
|
||||
|
||||
init_parameters = {"server": {"label": "IRC Server Address", "type": "string"},
|
||||
"port": {"label": "IRC Server Port", "type": "int"},
|
||||
@@ -25,11 +27,6 @@ class IrcBackend(AWXBaseEmailBackend):
|
||||
recipient_parameter = "targets"
|
||||
sender_parameter = None
|
||||
|
||||
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
|
||||
default_messages = {"started": {"message": DEFAULT_SUBJECT},
|
||||
"success": {"message": DEFAULT_SUBJECT},
|
||||
"error": {"message": DEFAULT_SUBJECT}}
|
||||
|
||||
def __init__(self, server, port, nickname, password, use_ssl, fail_silently=False, **kwargs):
|
||||
super(IrcBackend, self).__init__(fail_silently=fail_silently)
|
||||
self.server = server
|
||||
|
||||
@@ -7,23 +7,20 @@ import json
|
||||
|
||||
from django.utils.encoding import smart_text
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
|
||||
from awx.main.notifications.base import AWXBaseEmailBackend
|
||||
from awx.main.notifications.custom_notification_base import CustomNotificationBase
|
||||
|
||||
logger = logging.getLogger('awx.main.notifications.mattermost_backend')
|
||||
|
||||
|
||||
class MattermostBackend(AWXBaseEmailBackend):
|
||||
class MattermostBackend(AWXBaseEmailBackend, CustomNotificationBase):
|
||||
|
||||
init_parameters = {"mattermost_url": {"label": "Target URL", "type": "string"},
|
||||
"mattermost_no_verify_ssl": {"label": "Verify SSL", "type": "bool"}}
|
||||
recipient_parameter = "mattermost_url"
|
||||
sender_parameter = None
|
||||
|
||||
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
|
||||
default_messages = {"started": {"message": DEFAULT_SUBJECT},
|
||||
"success": {"message": DEFAULT_SUBJECT},
|
||||
"error": {"message": DEFAULT_SUBJECT}}
|
||||
|
||||
def __init__(self, mattermost_no_verify_ssl=False, mattermost_channel=None, mattermost_username=None,
|
||||
mattermost_icon_url=None, fail_silently=False, **kwargs):
|
||||
super(MattermostBackend, self).__init__(fail_silently=fail_silently)
|
||||
|
||||
@@ -1,17 +1,23 @@
|
||||
# Copyright (c) 2016 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
import json
|
||||
import logging
|
||||
import pygerduty
|
||||
|
||||
from django.utils.encoding import smart_text
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
|
||||
from awx.main.notifications.base import AWXBaseEmailBackend
|
||||
from awx.main.notifications.custom_notification_base import CustomNotificationBase
|
||||
|
||||
DEFAULT_BODY = CustomNotificationBase.DEFAULT_BODY
|
||||
DEFAULT_MSG = CustomNotificationBase.DEFAULT_MSG
|
||||
|
||||
logger = logging.getLogger('awx.main.notifications.pagerduty_backend')
|
||||
|
||||
|
||||
class PagerDutyBackend(AWXBaseEmailBackend):
|
||||
class PagerDutyBackend(AWXBaseEmailBackend, CustomNotificationBase):
|
||||
|
||||
init_parameters = {"subdomain": {"label": "Pagerduty subdomain", "type": "string"},
|
||||
"token": {"label": "API Token", "type": "password"},
|
||||
@@ -20,11 +26,14 @@ class PagerDutyBackend(AWXBaseEmailBackend):
|
||||
recipient_parameter = "service_key"
|
||||
sender_parameter = "client_name"
|
||||
|
||||
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
|
||||
DEFAULT_BODY = "{{ job_summary_dict }}"
|
||||
default_messages = {"started": { "message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
|
||||
"success": { "message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
|
||||
"error": { "message": DEFAULT_SUBJECT, "body": DEFAULT_BODY}}
|
||||
DEFAULT_BODY = "{{ job_metadata }}"
|
||||
default_messages = {"started": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
|
||||
"success": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
|
||||
"error": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
|
||||
"workflow_approval": {"running": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
|
||||
"approved": {"message": DEFAULT_MSG,"body": DEFAULT_BODY},
|
||||
"timed_out": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
|
||||
"denied": {"message": DEFAULT_MSG, "body": DEFAULT_BODY}}}
|
||||
|
||||
def __init__(self, subdomain, token, fail_silently=False, **kwargs):
|
||||
super(PagerDutyBackend, self).__init__(fail_silently=fail_silently)
|
||||
@@ -32,6 +41,16 @@ class PagerDutyBackend(AWXBaseEmailBackend):
|
||||
self.token = token
|
||||
|
||||
def format_body(self, body):
|
||||
# cast to dict if possible # TODO: is it true that this can be a dict or str?
|
||||
try:
|
||||
potential_body = json.loads(body)
|
||||
if isinstance(potential_body, dict):
|
||||
body = potential_body
|
||||
except json.JSONDecodeError:
|
||||
pass
|
||||
|
||||
# but it's okay if this is also just a string
|
||||
|
||||
return body
|
||||
|
||||
def send_messages(self, messages):
|
||||
|
||||
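To make the behaviour of the new PagerDuty `format_body` concrete, here is a small standalone usage sketch (not part of the diff): a JSON-encoded dict is converted into a real dict, while any other string is passed through unchanged.

```python
import json


def format_body(body):
    # JSON-encoded dicts are turned into real dicts; any other string is
    # passed through untouched, matching the hunk above.
    try:
        potential_body = json.loads(body)
        if isinstance(potential_body, dict):
            body = potential_body
    except json.JSONDecodeError:
        pass
    return body


print(format_body('{"summary": "job 42 failed", "dedup_key": "awx-job-42"}'))  # -> dict
print(format_body('plain text is left alone'))                                 # -> unchanged string
```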
@@ -7,22 +7,20 @@ import json
|
||||
|
||||
from django.utils.encoding import smart_text
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
|
||||
from awx.main.notifications.base import AWXBaseEmailBackend
|
||||
from awx.main.notifications.custom_notification_base import CustomNotificationBase
|
||||
|
||||
logger = logging.getLogger('awx.main.notifications.rocketchat_backend')
|
||||
|
||||
|
||||
class RocketChatBackend(AWXBaseEmailBackend):
|
||||
class RocketChatBackend(AWXBaseEmailBackend, CustomNotificationBase):
|
||||
|
||||
init_parameters = {"rocketchat_url": {"label": "Target URL", "type": "string"},
|
||||
"rocketchat_no_verify_ssl": {"label": "Verify SSL", "type": "bool"}}
|
||||
recipient_parameter = "rocketchat_url"
|
||||
sender_parameter = None
|
||||
|
||||
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
|
||||
default_messages = {"started": {"message": DEFAULT_SUBJECT},
|
||||
"success": {"message": DEFAULT_SUBJECT},
|
||||
"error": {"message": DEFAULT_SUBJECT}}
|
||||
|
||||
def __init__(self, rocketchat_no_verify_ssl=False, rocketchat_username=None, rocketchat_icon_url=None, fail_silently=False, **kwargs):
|
||||
super(RocketChatBackend, self).__init__(fail_silently=fail_silently)
|
||||
|
||||
@@ -6,24 +6,21 @@ from slackclient import SlackClient
|
||||
|
||||
from django.utils.encoding import smart_text
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
|
||||
from awx.main.notifications.base import AWXBaseEmailBackend
|
||||
from awx.main.notifications.custom_notification_base import CustomNotificationBase
|
||||
|
||||
logger = logging.getLogger('awx.main.notifications.slack_backend')
|
||||
WEBSOCKET_TIMEOUT = 30
|
||||
|
||||
|
||||
class SlackBackend(AWXBaseEmailBackend):
|
||||
class SlackBackend(AWXBaseEmailBackend, CustomNotificationBase):
|
||||
|
||||
init_parameters = {"token": {"label": "Token", "type": "password"},
|
||||
"channels": {"label": "Destination Channels", "type": "list"}}
|
||||
recipient_parameter = "channels"
|
||||
sender_parameter = None
|
||||
|
||||
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
|
||||
default_messages = {"started": {"message": DEFAULT_SUBJECT},
|
||||
"success": {"message": DEFAULT_SUBJECT},
|
||||
"error": {"message": DEFAULT_SUBJECT}}
|
||||
|
||||
def __init__(self, token, hex_color="", fail_silently=False, **kwargs):
|
||||
super(SlackBackend, self).__init__(fail_silently=fail_silently)
|
||||
self.token = token
|
||||
@@ -50,6 +47,7 @@ class SlackBackend(AWXBaseEmailBackend):
|
||||
else:
|
||||
ret = connection.api_call("chat.postMessage",
|
||||
channel=r,
|
||||
as_user=True,
|
||||
text=m.subject)
|
||||
logger.debug(ret)
|
||||
if ret['ok']:
|
||||
|
||||
@@ -7,12 +7,14 @@ from twilio.rest import Client
|
||||
|
||||
from django.utils.encoding import smart_text
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
|
||||
from awx.main.notifications.base import AWXBaseEmailBackend
|
||||
from awx.main.notifications.custom_notification_base import CustomNotificationBase
|
||||
|
||||
logger = logging.getLogger('awx.main.notifications.twilio_backend')
|
||||
|
||||
|
||||
class TwilioBackend(AWXBaseEmailBackend):
|
||||
class TwilioBackend(AWXBaseEmailBackend, CustomNotificationBase):
|
||||
|
||||
init_parameters = {"account_sid": {"label": "Account SID", "type": "string"},
|
||||
"account_token": {"label": "Account Token", "type": "password"},
|
||||
@@ -21,11 +23,6 @@ class TwilioBackend(AWXBaseEmailBackend):
|
||||
recipient_parameter = "to_numbers"
|
||||
sender_parameter = "from_number"
|
||||
|
||||
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
|
||||
default_messages = {"started": {"message": DEFAULT_SUBJECT},
|
||||
"success": {"message": DEFAULT_SUBJECT},
|
||||
"error": {"message": DEFAULT_SUBJECT}}
|
||||
|
||||
def __init__(self, account_sid, account_token, fail_silently=False, **kwargs):
|
||||
super(TwilioBackend, self).__init__(fail_silently=fail_silently)
|
||||
self.account_sid = account_sid
|
||||
|
||||
@@ -7,13 +7,15 @@ import requests
|
||||
|
||||
from django.utils.encoding import smart_text
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
|
||||
from awx.main.notifications.base import AWXBaseEmailBackend
|
||||
from awx.main.utils import get_awx_version
|
||||
from awx.main.notifications.custom_notification_base import CustomNotificationBase
|
||||
|
||||
logger = logging.getLogger('awx.main.notifications.webhook_backend')
|
||||
|
||||
|
||||
class WebhookBackend(AWXBaseEmailBackend):
|
||||
class WebhookBackend(AWXBaseEmailBackend, CustomNotificationBase):
|
||||
|
||||
init_parameters = {"url": {"label": "Target URL", "type": "string"},
|
||||
"http_method": {"label": "HTTP Method", "type": "string", "default": "POST"},
|
||||
@@ -24,10 +26,16 @@ class WebhookBackend(AWXBaseEmailBackend):
|
||||
recipient_parameter = "url"
|
||||
sender_parameter = None
|
||||
|
||||
DEFAULT_BODY = "{{ job_summary_dict }}"
|
||||
DEFAULT_BODY = "{{ job_metadata }}"
|
||||
default_messages = {"started": {"body": DEFAULT_BODY},
|
||||
"success": {"body": DEFAULT_BODY},
|
||||
"error": {"body": DEFAULT_BODY}}
|
||||
"error": {"body": DEFAULT_BODY},
|
||||
"workflow_approval": {
|
||||
"running": {"body": '{"body": "The approval node \\"{{ approval_node_name }}\\" needs review. '
|
||||
'This node can be viewed at: {{ workflow_url }}"}'},
|
||||
"approved": {"body": '{"body": "The approval node \\"{{ approval_node_name }}\\" was approved. {{ workflow_url }}"}'},
|
||||
"timed_out": {"body": '{"body": "The approval node \\"{{ approval_node_name }}\\" has timed out. {{ workflow_url }}"}'},
|
||||
"denied": {"body": '{"body": "The approval node \\"{{ approval_node_name }}\\" was denied. {{ workflow_url }}"}'}}}
|
||||
|
||||
def __init__(self, http_method, headers, disable_ssl_verification=False, fail_silently=False, username=None, password=None, **kwargs):
|
||||
self.http_method = http_method
|
||||
@@ -38,15 +46,13 @@ class WebhookBackend(AWXBaseEmailBackend):
|
||||
super(WebhookBackend, self).__init__(fail_silently=fail_silently)
|
||||
|
||||
def format_body(self, body):
|
||||
# If `body` has body field, attempt to use this as the main body,
|
||||
# otherwise, leave it as a sub-field
|
||||
if isinstance(body, dict) and 'body' in body and isinstance(body['body'], str):
|
||||
try:
|
||||
potential_body = json.loads(body['body'])
|
||||
if isinstance(potential_body, dict):
|
||||
body = potential_body
|
||||
except json.JSONDecodeError:
|
||||
pass
|
||||
# expect body to be a string representing a dict
|
||||
try:
|
||||
potential_body = json.loads(body)
|
||||
if isinstance(potential_body, dict):
|
||||
body = potential_body
|
||||
except json.JSONDecodeError:
|
||||
body = {}
|
||||
return body
|
||||
|
||||
def send_messages(self, messages):
|
||||
|
||||
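The `workflow_approval` defaults above are JSON written inside a template string; once rendered, they are expected to parse into a dict that becomes the webhook payload. A standalone sketch with a made-up node name and URL, modelled on the new `format_body` logic:

```python
import json

# Made-up example of one of the workflow_approval bodies above after template
# rendering (the node name and URL are invented):
rendered = '{"body": "The approval node \\"deploy gate\\" was denied. https://awx.example.com/#/workflows/7"}'

payload = rendered
try:
    # A string that encodes a dict becomes the request payload; an
    # unparsable string would fall back to an empty dict.
    potential = json.loads(rendered)
    if isinstance(potential, dict):
        payload = potential
except json.JSONDecodeError:
    payload = {}

print(payload["body"])
```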
@@ -12,10 +12,12 @@ class UriCleaner(object):
|
||||
|
||||
@staticmethod
|
||||
def remove_sensitive(cleartext):
|
||||
# exclude_list contains the items that will _not_ be redacted
|
||||
exclude_list = [settings.PUBLIC_GALAXY_SERVER['url']]
|
||||
if settings.PRIMARY_GALAXY_URL:
|
||||
exclude_list = [settings.PRIMARY_GALAXY_URL] + [server['url'] for server in settings.FALLBACK_GALAXY_SERVERS]
|
||||
else:
|
||||
exclude_list = [server['url'] for server in settings.FALLBACK_GALAXY_SERVERS]
|
||||
exclude_list += [settings.PRIMARY_GALAXY_URL]
|
||||
if settings.FALLBACK_GALAXY_SERVERS:
|
||||
exclude_list += [server['url'] for server in settings.FALLBACK_GALAXY_SERVERS]
|
||||
redactedtext = cleartext
|
||||
text_index = 0
|
||||
while True:
|
||||
|
||||
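A standalone sketch of the exclude-list construction above, with plain variables standing in for the Django settings (values are made up). The URLs collected here are the ones `remove_sensitive` will leave unredacted, since they are not secret on their own.

```python
PUBLIC_GALAXY_SERVER = {'id': 'galaxy', 'url': 'https://galaxy.ansible.com'}
PRIMARY_GALAXY_URL = 'https://galaxy.example.com'   # empty string disables the feature
FALLBACK_GALAXY_SERVERS = [{'id': 'mirror', 'url': 'https://mirror.example.com'}]

# exclude_list contains the items that will _not_ be redacted
exclude_list = [PUBLIC_GALAXY_SERVER['url']]
if PRIMARY_GALAXY_URL:
    exclude_list += [PRIMARY_GALAXY_URL]
if FALLBACK_GALAXY_SERVERS:
    exclude_list += [server['url'] for server in FALLBACK_GALAXY_SERVERS]

print(exclude_list)
```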
@@ -1,9 +1,5 @@
|
||||
import collections
|
||||
import os
|
||||
import stat
|
||||
import time
|
||||
import yaml
|
||||
import tempfile
|
||||
import logging
|
||||
from base64 import b64encode
|
||||
|
||||
@@ -88,8 +84,17 @@ class PodManager(object):
|
||||
|
||||
@cached_property
|
||||
def kube_api(self):
|
||||
my_client = config.new_client_from_config(config_file=self.kube_config)
|
||||
return client.CoreV1Api(api_client=my_client)
|
||||
# this feels a little janky, but it's what k8s' own code does
|
||||
# internally when it reads kube config files from disk:
|
||||
# https://github.com/kubernetes-client/python-base/blob/0b208334ef0247aad9afcaae8003954423b61a0d/config/kube_config.py#L643
|
||||
loader = config.kube_config.KubeConfigLoader(
|
||||
config_dict=self.kube_config
|
||||
)
|
||||
cfg = type.__call__(client.Configuration)
|
||||
loader.load_and_set(cfg)
|
||||
return client.CoreV1Api(api_client=client.ApiClient(
|
||||
configuration=cfg
|
||||
))
|
||||
|
||||
@property
|
||||
def pod_name(self):
|
||||
@@ -174,10 +179,4 @@ def generate_tmp_kube_config(credential, namespace):
|
||||
).decode() # decode the base64 data into a str
|
||||
else:
|
||||
config["clusters"][0]["cluster"]["insecure-skip-tls-verify"] = True
|
||||
|
||||
fd, path = tempfile.mkstemp(prefix='kubeconfig')
|
||||
with open(path, 'wb') as temp:
|
||||
temp.write(yaml.dump(config).encode())
|
||||
temp.flush()
|
||||
os.chmod(temp.name, stat.S_IRUSR | stat.S_IWUSR | stat.S_IXUSR)
|
||||
return path
|
||||
return config
|
||||
|
||||
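The two hunks above stop writing a temporary kubeconfig file to disk: `generate_tmp_kube_config` now returns a plain dict, and `kube_api` feeds that dict straight to the kubernetes client's `KubeConfigLoader`. A hedged sketch of that pattern, assuming the `kubernetes` Python package is installed; the cluster, namespace, and token below are invented.

```python
from kubernetes import client, config

# Minimal in-memory kubeconfig of roughly the shape the code above works with.
kube_config = {
    "apiVersion": "v1",
    "kind": "Config",
    "current-context": "awx-context",
    "clusters": [
        {"name": "awx-cluster",
         "cluster": {"server": "https://k8s.example.com",
                     "insecure-skip-tls-verify": True}},
    ],
    "contexts": [
        {"name": "awx-context",
         "context": {"cluster": "awx-cluster", "user": "awx-user", "namespace": "awx"}},
    ],
    "users": [
        {"name": "awx-user", "user": {"token": "REDACTED"}},
    ],
}

# Load the dict directly instead of pointing new_client_from_config at a file.
loader = config.kube_config.KubeConfigLoader(config_dict=kube_config)
cfg = client.Configuration()
loader.load_and_set(cfg)
api = client.CoreV1Api(api_client=client.ApiClient(configuration=cfg))
```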
@@ -252,19 +252,25 @@ class TaskManager():
|
||||
logger.debug('Submitting isolated {} to queue {} controlled by {}.'.format(
|
||||
task.log_format, task.execution_node, controller_node))
|
||||
elif rampart_group.is_containerized:
|
||||
# find one real, non-containerized instance with capacity to
|
||||
# act as the controller for k8s API interaction
|
||||
match = None
|
||||
for group in InstanceGroup.objects.all():
|
||||
if group.is_containerized or group.controller_id:
|
||||
continue
|
||||
match = group.find_largest_idle_instance()
|
||||
if match:
|
||||
break
|
||||
task.instance_group = rampart_group
|
||||
if not task.supports_isolation():
|
||||
if task.supports_isolation():
|
||||
task.controller_node = match.hostname
|
||||
else:
|
||||
# project updates and inventory updates don't *actually* run in pods,
|
||||
# so just pick *any* non-isolated, non-containerized host and use it
|
||||
for group in InstanceGroup.objects.all():
|
||||
if group.is_containerized or group.controller_id:
|
||||
continue
|
||||
match = group.find_largest_idle_instance()
|
||||
if match:
|
||||
task.execution_node = match.hostname
|
||||
logger.debug('Submitting containerized {} to queue {}.'.format(
|
||||
task.log_format, task.execution_node))
|
||||
break
|
||||
# as the execution node
|
||||
task.execution_node = match.hostname
|
||||
logger.debug('Submitting containerized {} to queue {}.'.format(
|
||||
task.log_format, task.execution_node))
|
||||
else:
|
||||
task.instance_group = rampart_group
|
||||
if instance is not None:
|
||||
|
||||
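A simplified, self-contained sketch of the selection loop above: pick the first non-containerized, non-isolated group that still has an idle instance, and use that instance as the controller (or execution) node. `FakeGroup` and `FakeInstance` are stand-ins for the real `InstanceGroup`/`Instance` models.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class FakeInstance:
    hostname: str


@dataclass
class FakeGroup:
    name: str
    is_containerized: bool = False
    controller_id: Optional[int] = None
    idle_instances: List[FakeInstance] = field(default_factory=list)

    def find_largest_idle_instance(self):
        return self.idle_instances[0] if self.idle_instances else None


def pick_controller(groups):
    # Skip containerized groups and groups that already have a controller;
    # return the first idle instance found elsewhere.
    for group in groups:
        if group.is_containerized or group.controller_id:
            continue
        match = group.find_largest_idle_instance()
        if match:
            return match
    return None


groups = [
    FakeGroup("container", is_containerized=True),
    FakeGroup("tower", idle_instances=[FakeInstance("awx-task-1")]),
]
print(pick_controller(groups).hostname)  # -> awx-task-1
```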
@@ -22,10 +22,6 @@ import yaml
|
||||
import fcntl
|
||||
from pathlib import Path
|
||||
from uuid import uuid4
|
||||
try:
|
||||
import psutil
|
||||
except Exception:
|
||||
psutil = None
|
||||
import urllib.parse as urlparse
|
||||
|
||||
# Django
|
||||
@@ -34,7 +30,6 @@ from django.db import transaction, DatabaseError, IntegrityError
|
||||
from django.db.models.fields.related import ForeignKey
|
||||
from django.utils.timezone import now, timedelta
|
||||
from django.utils.encoding import smart_str
|
||||
from django.core.mail import send_mail
|
||||
from django.contrib.auth.models import User
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.core.cache import cache
|
||||
@@ -72,7 +67,6 @@ from awx.main.isolated import manager as isolated_manager
|
||||
from awx.main.dispatch.publish import task
|
||||
from awx.main.dispatch import get_local_queuename, reaper
|
||||
from awx.main.utils import (get_ssh_version, update_scm_url,
|
||||
get_licenser,
|
||||
ignore_inventory_computed_fields,
|
||||
ignore_inventory_group_removal, extract_ansible_vars, schedule_task_manager,
|
||||
get_awx_version)
|
||||
@@ -92,7 +86,7 @@ from rest_framework.exceptions import PermissionDenied
|
||||
__all__ = ['RunJob', 'RunSystemJob', 'RunProjectUpdate', 'RunInventoryUpdate',
|
||||
'RunAdHocCommand', 'handle_work_error', 'handle_work_success', 'apply_cluster_membership_policies',
|
||||
'update_inventory_computed_fields', 'update_host_smart_inventory_memberships',
|
||||
'send_notifications', 'run_administrative_checks', 'purge_old_stdout_files']
|
||||
'send_notifications', 'purge_old_stdout_files']
|
||||
|
||||
HIDDEN_PASSWORD = '**********'
|
||||
|
||||
@@ -356,28 +350,6 @@ def gather_analytics():
|
||||
os.remove(tgz)
|
||||
|
||||
|
||||
@task()
|
||||
def run_administrative_checks():
|
||||
logger.warn("Running administrative checks.")
|
||||
if not settings.TOWER_ADMIN_ALERTS:
|
||||
return
|
||||
validation_info = get_licenser().validate()
|
||||
if validation_info['license_type'] != 'open' and validation_info.get('instance_count', 0) < 1:
|
||||
return
|
||||
used_percentage = float(validation_info.get('current_instances', 0)) / float(validation_info.get('instance_count', 100))
|
||||
tower_admin_emails = User.objects.filter(is_superuser=True).values_list('email', flat=True)
|
||||
if (used_percentage * 100) > 90:
|
||||
send_mail("Ansible Tower host usage over 90%",
|
||||
_("Ansible Tower host usage over 90%"),
|
||||
tower_admin_emails,
|
||||
fail_silently=True)
|
||||
if validation_info.get('date_warning', False):
|
||||
send_mail("Ansible Tower license will expire soon",
|
||||
_("Ansible Tower license will expire soon"),
|
||||
tower_admin_emails,
|
||||
fail_silently=True)
|
||||
|
||||
|
||||
@task(queue=get_local_queuename)
|
||||
def purge_old_stdout_files():
|
||||
nowtime = time.time()
|
||||
@@ -1423,7 +1395,6 @@ class BaseTask(object):
|
||||
def deploy_container_group_pod(self, task):
|
||||
from awx.main.scheduler.kubernetes import PodManager # Avoid circular import
|
||||
pod_manager = PodManager(self.instance)
|
||||
self.cleanup_paths.append(pod_manager.kube_config)
|
||||
try:
|
||||
log_name = task.log_format
|
||||
logger.debug(f"Launching pod for {log_name}.")
|
||||
@@ -1452,7 +1423,7 @@ class BaseTask(object):
|
||||
self.update_model(task.pk, execution_node=pod_manager.pod_name)
|
||||
return pod_manager
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
@@ -1959,9 +1930,15 @@ class RunProjectUpdate(BaseTask):
|
||||
env['PROJECT_UPDATE_ID'] = str(project_update.pk)
|
||||
env['ANSIBLE_CALLBACK_PLUGINS'] = self.get_path_to('..', 'plugins', 'callback')
|
||||
env['ANSIBLE_GALAXY_IGNORE'] = True
|
||||
# Set up the fallback server, which is the normal Ansible Galaxy by default
|
||||
galaxy_servers = list(settings.FALLBACK_GALAXY_SERVERS)
|
||||
# If private galaxy URL is non-blank, that means this feature is enabled
|
||||
# Set up the public Galaxy server, if enabled
|
||||
if settings.PUBLIC_GALAXY_ENABLED:
|
||||
galaxy_servers = [settings.PUBLIC_GALAXY_SERVER]
|
||||
else:
|
||||
galaxy_servers = []
|
||||
# Set up fallback Galaxy servers, if configured
|
||||
if settings.FALLBACK_GALAXY_SERVERS:
|
||||
galaxy_servers = settings.FALLBACK_GALAXY_SERVERS + galaxy_servers
|
||||
# Set up the primary Galaxy server, if configured
|
||||
if settings.PRIMARY_GALAXY_URL:
|
||||
galaxy_servers = [{'id': 'primary_galaxy'}] + galaxy_servers
|
||||
for key in GALAXY_SERVER_FIELDS:
|
||||
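The net effect of the branches above is an ordered server list: the primary Galaxy server (if configured) takes precedence over any fallback servers, which in turn take precedence over public Galaxy. A sketch with plain values in place of the settings; the primary URL is filled in directly here, whereas the real code copies its fields from `GALAXY_SERVER_FIELDS` afterwards.

```python
PUBLIC_GALAXY_ENABLED = True
PUBLIC_GALAXY_SERVER = {'id': 'galaxy', 'url': 'https://galaxy.ansible.com'}
FALLBACK_GALAXY_SERVERS = [{'id': 'mirror', 'url': 'https://mirror.example.com'}]
PRIMARY_GALAXY_URL = 'https://galaxy.example.com'

galaxy_servers = [PUBLIC_GALAXY_SERVER] if PUBLIC_GALAXY_ENABLED else []
if FALLBACK_GALAXY_SERVERS:
    galaxy_servers = FALLBACK_GALAXY_SERVERS + galaxy_servers
if PRIMARY_GALAXY_URL:
    galaxy_servers = [{'id': 'primary_galaxy', 'url': PRIMARY_GALAXY_URL}] + galaxy_servers

# Highest-precedence server ends up first in the list:
print([s['id'] for s in galaxy_servers])  # -> ['primary_galaxy', 'mirror', 'galaxy']
```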
@@ -2354,6 +2331,27 @@ class RunInventoryUpdate(BaseTask):
|
||||
env[str(env_k)] = str(inventory_update.source_vars_dict[env_k])
|
||||
elif inventory_update.source == 'file':
|
||||
raise NotImplementedError('Cannot update file sources through the task system.')
|
||||
|
||||
if inventory_update.source == 'scm' and inventory_update.source_project_update:
|
||||
env_key = 'ANSIBLE_COLLECTIONS_PATHS'
|
||||
config_setting = 'collections_paths'
|
||||
folder = 'requirements_collections'
|
||||
default = '~/.ansible/collections:/usr/share/ansible/collections'
|
||||
|
||||
config_values = read_ansible_config(os.path.join(private_data_dir, 'project'), [config_setting])
|
||||
|
||||
paths = default.split(':')
|
||||
if env_key in env:
|
||||
for path in env[env_key].split(':'):
|
||||
if path not in paths:
|
||||
paths = [env[env_key]] + paths
|
||||
elif config_setting in config_values:
|
||||
for path in config_values[config_setting].split(':'):
|
||||
if path not in paths:
|
||||
paths = [config_values[config_setting]] + paths
|
||||
paths = [os.path.join(private_data_dir, folder)] + paths
|
||||
env[env_key] = os.pathsep.join(paths)
|
||||
|
||||
return env
|
||||
|
||||
def write_args_file(self, private_data_dir, args):
|
||||
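A simplified sketch of the `ANSIBLE_COLLECTIONS_PATHS` handling above: start from the Ansible defaults, keep any value already present in the environment, and put the project's `requirements_collections` directory first. Paths are made up, and the de-duplication check is an approximation of the loop in the hunk.

```python
import os

default = '~/.ansible/collections:/usr/share/ansible/collections'
private_data_dir = '/tmp/awx_123_abc'   # hypothetical job directory
env = {'ANSIBLE_COLLECTIONS_PATHS': '/opt/custom/collections'}

paths = default.split(':')
existing = env.get('ANSIBLE_COLLECTIONS_PATHS')
if existing and any(p not in paths for p in existing.split(':')):
    paths = [existing] + paths
# The job's own requirements_collections directory wins over everything else.
paths = [os.path.join(private_data_dir, 'requirements_collections')] + paths

env['ANSIBLE_COLLECTIONS_PATHS'] = os.pathsep.join(paths)
print(env['ANSIBLE_COLLECTIONS_PATHS'])
```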
@@ -2452,7 +2450,7 @@ class RunInventoryUpdate(BaseTask):
|
||||
# Use the vendored script path
|
||||
inventory_path = self.get_path_to('..', 'plugins', 'inventory', injector.script_name)
|
||||
elif src == 'scm':
|
||||
inventory_path = inventory_update.get_actual_source_path()
|
||||
inventory_path = os.path.join(private_data_dir, 'project', inventory_update.source_path)
|
||||
elif src == 'custom':
|
||||
handle, inventory_path = tempfile.mkstemp(dir=private_data_dir)
|
||||
f = os.fdopen(handle, 'w')
|
||||
@@ -2473,7 +2471,7 @@ class RunInventoryUpdate(BaseTask):
|
||||
'''
|
||||
src = inventory_update.source
|
||||
if src == 'scm' and inventory_update.source_project_update:
|
||||
return inventory_update.source_project_update.get_project_path(check_if_exists=False)
|
||||
return os.path.join(private_data_dir, 'project')
|
||||
if src in CLOUD_PROVIDERS:
|
||||
injector = None
|
||||
if src in InventorySource.injectors:
|
||||
@@ -2509,8 +2507,10 @@ class RunInventoryUpdate(BaseTask):
|
||||
|
||||
project_update_task = local_project_sync._get_task_class()
|
||||
try:
|
||||
project_update_task().run(local_project_sync.id)
|
||||
inventory_update.inventory_source.scm_last_revision = local_project_sync.project.scm_revision
|
||||
sync_task = project_update_task(job_private_data_dir=private_data_dir)
|
||||
sync_task.run(local_project_sync.id)
|
||||
local_project_sync.refresh_from_db()
|
||||
inventory_update.inventory_source.scm_last_revision = local_project_sync.scm_revision
|
||||
inventory_update.inventory_source.save(update_fields=['scm_last_revision'])
|
||||
except Exception:
|
||||
inventory_update = self.update_model(
|
||||
@@ -2518,6 +2518,13 @@ class RunInventoryUpdate(BaseTask):
|
||||
job_explanation=('Previous Task Failed: {"job_type": "%s", "job_name": "%s", "job_id": "%s"}' %
|
||||
('project_update', local_project_sync.name, local_project_sync.id)))
|
||||
raise
|
||||
elif inventory_update.source == 'scm' and inventory_update.launch_type == 'scm' and source_project:
|
||||
# This follows update, not sync, so make copy here
|
||||
project_path = source_project.get_project_path(check_if_exists=False)
|
||||
RunProjectUpdate.make_local_copy(
|
||||
project_path, os.path.join(private_data_dir, 'project'),
|
||||
source_project.scm_type, source_project.scm_revision
|
||||
)
|
||||
|
||||
|
||||
@task()
|
||||
|
||||
awx/main/tests/functional/api/test_events.py (new file, 45 lines)
@@ -0,0 +1,45 @@
|
||||
import pytest
|
||||
|
||||
from awx.api.versioning import reverse
|
||||
from awx.main.models import AdHocCommand, AdHocCommandEvent, JobEvent
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
@pytest.mark.parametrize('truncate, expected', [
|
||||
(True, False),
|
||||
(False, True),
|
||||
])
|
||||
def test_job_events_sublist_truncation(get, organization_factory, job_template_factory, truncate, expected):
|
||||
objs = organization_factory("org", superusers=['admin'])
|
||||
jt = job_template_factory("jt", organization=objs.organization,
|
||||
inventory='test_inv', project='test_proj').job_template
|
||||
job = jt.create_unified_job()
|
||||
JobEvent.create_from_data(job_id=job.pk, uuid='abc123', event='runner_on_start',
|
||||
stdout='a' * 1025)
|
||||
|
||||
url = reverse('api:job_job_events_list', kwargs={'pk': job.pk})
|
||||
if not truncate:
|
||||
url += '?no_truncate=1'
|
||||
|
||||
response = get(url, user=objs.superusers.admin, expect=200)
|
||||
assert (len(response.data['results'][0]['stdout']) == 1025) == expected
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
@pytest.mark.parametrize('truncate, expected', [
|
||||
(True, False),
|
||||
(False, True),
|
||||
])
|
||||
def test_ad_hoc_events_sublist_truncation(get, organization_factory, job_template_factory, truncate, expected):
|
||||
objs = organization_factory("org", superusers=['admin'])
|
||||
adhoc = AdHocCommand()
|
||||
adhoc.save()
|
||||
AdHocCommandEvent.create_from_data(ad_hoc_command_id=adhoc.pk, uuid='abc123', event='runner_on_start',
|
||||
stdout='a' * 1025)
|
||||
|
||||
url = reverse('api:ad_hoc_command_ad_hoc_command_events_list', kwargs={'pk': adhoc.pk})
|
||||
if not truncate:
|
||||
url += '?no_truncate=1'
|
||||
|
||||
response = get(url, user=objs.superusers.admin, expect=200)
|
||||
assert (len(response.data['results'][0]['stdout']) == 1025) == expected
|
||||
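Roughly, the behaviour these tests pin down is: event `stdout` in sublist views is cut down unless `?no_truncate=1` is passed. The 1024-character cutoff below is only an assumption to make the example concrete; the API's actual limit and formatting may differ.

```python
def displayed_stdout(stdout, no_truncate=False, limit=1024):
    # Return stdout unchanged when truncation is disabled or not needed,
    # otherwise cut it down to the limit.
    if no_truncate or len(stdout) <= limit:
        return stdout
    return stdout[:limit]


event_stdout = 'a' * 1025
assert len(displayed_stdout(event_stdout)) == 1024
assert len(displayed_stdout(event_stdout, no_truncate=True)) == 1025
```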
@@ -117,3 +117,10 @@ def test_handle_content_type(post, admin):
|
||||
admin,
|
||||
content_type='text/html',
|
||||
expect=415)
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_basic_not_found(get, admin_user):
|
||||
root_url = reverse('api:api_v2_root_view')
|
||||
r = get(root_url + 'fooooooo', user=admin_user, expect=404)
|
||||
assert r.data.get('detail') == 'The requested resource could not be found.'
|
||||
|
||||
@@ -45,6 +45,14 @@ def isolated_instance_group(instance_group, instance):
|
||||
return ig
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def containerized_instance_group(instance_group, kube_credential):
|
||||
ig = InstanceGroup(name="container")
|
||||
ig.credential = kube_credential
|
||||
ig.save()
|
||||
return ig
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def create_job_factory(job_factory, instance_group):
|
||||
def fn(status='running'):
|
||||
@@ -240,3 +248,29 @@ def test_instance_group_order_persistence(get, post, admin, source_model):
|
||||
resp = get(url, admin)
|
||||
assert resp.data['count'] == total
|
||||
assert [ig['name'] for ig in resp.data['results']] == [ig.name for ig in before]
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_instance_group_update_fields(patch, instance, instance_group, admin, containerized_instance_group):
|
||||
# policy_instance_ variables can only be updated in instance groups that are NOT containerized
|
||||
# instance group (not containerized)
|
||||
ig_url = reverse("api:instance_group_detail", kwargs={'pk': instance_group.pk})
|
||||
assert not instance_group.is_containerized
|
||||
assert not containerized_instance_group.is_isolated
|
||||
resp = patch(ig_url, {'policy_instance_percentage':15}, admin, expect=200)
|
||||
assert 15 == resp.data['policy_instance_percentage']
|
||||
resp = patch(ig_url, {'policy_instance_minimum':15}, admin, expect=200)
|
||||
assert 15 == resp.data['policy_instance_minimum']
|
||||
resp = patch(ig_url, {'policy_instance_list':[instance.hostname]}, admin)
|
||||
assert [instance.hostname] == resp.data['policy_instance_list']
|
||||
|
||||
# containerized instance group
|
||||
cg_url = reverse("api:instance_group_detail", kwargs={'pk': containerized_instance_group.pk})
|
||||
assert containerized_instance_group.is_containerized
|
||||
assert not containerized_instance_group.is_isolated
|
||||
resp = patch(cg_url, {'policy_instance_percentage':15}, admin, expect=400)
|
||||
assert ["Containerized instances may not be managed via the API"] == resp.data['policy_instance_percentage']
|
||||
resp = patch(cg_url, {'policy_instance_minimum':15}, admin, expect=400)
|
||||
assert ["Containerized instances may not be managed via the API"] == resp.data['policy_instance_minimum']
|
||||
resp = patch(cg_url, {'policy_instance_list':[instance.hostname]}, admin)
|
||||
assert ["Containerized instances may not be managed via the API"] == resp.data['policy_instance_list']
|
||||
|
||||
@@ -8,6 +8,8 @@ from unittest.mock import PropertyMock
|
||||
|
||||
# Django
|
||||
from django.urls import resolve
|
||||
from django.http import Http404
|
||||
from django.core.handlers.exception import response_for_exception
|
||||
from django.contrib.auth.models import User
|
||||
from django.core.serializers.json import DjangoJSONEncoder
|
||||
from django.db.backends.sqlite3.base import SQLiteCursorWrapper
|
||||
@@ -581,8 +583,12 @@ def _request(verb):
|
||||
if 'format' not in kwargs and 'content_type' not in kwargs:
|
||||
kwargs['format'] = 'json'
|
||||
|
||||
view, view_args, view_kwargs = resolve(urllib.parse.urlparse(url)[2])
|
||||
request = getattr(APIRequestFactory(), verb)(url, **kwargs)
|
||||
request_error = None
|
||||
try:
|
||||
view, view_args, view_kwargs = resolve(urllib.parse.urlparse(url)[2])
|
||||
except Http404 as e:
|
||||
request_error = e
|
||||
if isinstance(kwargs.get('cookies', None), dict):
|
||||
for key, value in kwargs['cookies'].items():
|
||||
request.COOKIES[key] = value
|
||||
@@ -591,7 +597,10 @@ def _request(verb):
|
||||
if user:
|
||||
force_authenticate(request, user=user)
|
||||
|
||||
response = view(request, *view_args, **view_kwargs)
|
||||
if not request_error:
|
||||
response = view(request, *view_args, **view_kwargs)
|
||||
else:
|
||||
response = response_for_exception(request, request_error)
|
||||
if middleware:
|
||||
middleware.process_response(request, response)
|
||||
if expect:
|
||||
|
||||
@@ -87,7 +87,7 @@ class TestJobNotificationMixin(object):
|
||||
'use_fact_cache': bool,
|
||||
'verbosity': int},
|
||||
'job_friendly_name': str,
|
||||
'job_summary_dict': str,
|
||||
'job_metadata': str,
|
||||
'url': str}
|
||||
|
||||
|
||||
@@ -144,5 +144,3 @@ class TestJobNotificationMixin(object):
|
||||
|
||||
context_stub = JobNotificationMixin.context_stub()
|
||||
check_structure_and_completeness(TestJobNotificationMixin.CONTEXT_STRUCTURE, context_stub)
|
||||
|
||||
|
||||
|
||||
@@ -1,5 +1,4 @@
|
||||
import subprocess
|
||||
import yaml
|
||||
import base64
|
||||
|
||||
from unittest import mock # noqa
|
||||
@@ -51,6 +50,5 @@ def test_kubectl_ssl_verification(containerized_job):
|
||||
cred.inputs['ssl_ca_cert'] = cert.stdout
|
||||
cred.save()
|
||||
pm = PodManager(containerized_job)
|
||||
config = yaml.load(open(pm.kube_config), Loader=yaml.FullLoader)
|
||||
ca_data = config['clusters'][0]['cluster']['certificate-authority-data']
|
||||
ca_data = pm.kube_config['clusters'][0]['cluster']['certificate-authority-data']
|
||||
assert cert.stdout == base64.b64decode(ca_data.encode())
|
||||
|
||||
@@ -264,6 +264,7 @@ def test_inventory_update_injected_content(this_kind, script_or_plugin, inventor
|
||||
assert envvars.pop('ANSIBLE_INVENTORY_ENABLED') == ('auto' if use_plugin else 'script')
|
||||
set_files = bool(os.getenv("MAKE_INVENTORY_REFERENCE_FILES", 'false').lower()[0] not in ['f', '0'])
|
||||
env, content = read_content(private_data_dir, envvars, inventory_update)
|
||||
env.pop('ANSIBLE_COLLECTIONS_PATHS', None) # collection paths not relevant to this test
|
||||
base_dir = os.path.join(DATA, script_or_plugin)
|
||||
if not os.path.exists(base_dir):
|
||||
os.mkdir(base_dir)
|
||||
|
||||
@@ -43,7 +43,7 @@ def test_basic_parameterization(get, post, user, organization):
|
||||
assert 'url' in response.data['notification_configuration']
|
||||
assert 'headers' in response.data['notification_configuration']
|
||||
assert 'messages' in response.data
|
||||
assert response.data['messages'] == {'started': None, 'success': None, 'error': None}
|
||||
assert response.data['messages'] == {'started': None, 'success': None, 'error': None, 'workflow_approval': None}
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
|
||||
@@ -19,6 +19,8 @@ from awx.main.models import (
|
||||
Credential
|
||||
)
|
||||
|
||||
from rest_framework.exceptions import PermissionDenied
|
||||
|
||||
from crum import impersonate
|
||||
|
||||
|
||||
@@ -252,7 +254,8 @@ class TestJobRelaunchAccess:
|
||||
|
||||
assert 'job_var' in job.launch_config.extra_data
|
||||
assert bob.can_access(Job, 'start', job, validate_license=False)
|
||||
assert not alice.can_access(Job, 'start', job, validate_license=False)
|
||||
with pytest.raises(PermissionDenied):
|
||||
alice.can_access(Job, 'start', job, validate_license=False)
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
|
||||
@@ -7,6 +7,8 @@ from awx.main.access import (
|
||||
# WorkflowJobNodeAccess
|
||||
)
|
||||
|
||||
from rest_framework.exceptions import PermissionDenied
|
||||
|
||||
from awx.main.models import InventorySource, JobLaunchConfig
|
||||
|
||||
|
||||
@@ -169,7 +171,8 @@ class TestWorkflowJobAccess:
|
||||
wfjt.ask_inventory_on_launch = True
|
||||
wfjt.save()
|
||||
JobLaunchConfig.objects.create(job=workflow_job, inventory=inventory)
|
||||
assert not WorkflowJobAccess(rando).can_start(workflow_job)
|
||||
with pytest.raises(PermissionDenied):
|
||||
WorkflowJobAccess(rando).can_start(workflow_job)
|
||||
inventory.use_role.members.add(rando)
|
||||
assert WorkflowJobAccess(rando).can_start(workflow_job)
|
||||
|
||||
|
||||
@@ -26,7 +26,7 @@ class TestNotificationTemplateSerializer():
|
||||
{'started': {'message': '{{ job.id }}', 'body': '{{ job.status }}'},
|
||||
'success': {'message': None, 'body': '{{ job_friendly_name }}'},
|
||||
'error': {'message': '{{ url }}', 'body': None}},
|
||||
{'started': {'body': '{{ job_summary_dict }}'}},
|
||||
{'started': {'body': '{{ job_metadata }}'}},
|
||||
{'started': {'body': '{{ job.summary_fields.inventory.total_hosts }}'}},
|
||||
{'started': {'body': u'Iñtërnâtiônàlizætiøn'}}
|
||||
])
|
||||
|
||||
@@ -234,6 +234,14 @@ class TestWorkflowJobNodeJobKWARGS:
|
||||
job_node_no_prompts.unified_job_template = project_unit
|
||||
assert job_node_no_prompts.get_job_kwargs() == self.kwargs_base
|
||||
|
||||
def test_extra_vars_node_prompts(self, wfjt_node_no_prompts):
|
||||
wfjt_node_no_prompts.extra_vars = {'foo': 'bar'}
|
||||
assert wfjt_node_no_prompts.prompts_dict() == {'extra_vars': {'foo': 'bar'}}
|
||||
|
||||
def test_string_extra_vars_node_prompts(self, wfjt_node_no_prompts):
|
||||
wfjt_node_no_prompts.extra_vars = '{"foo": "bar"}'
|
||||
assert wfjt_node_no_prompts.prompts_dict() == {'extra_vars': {'foo': 'bar'}}
|
||||
|
||||
|
||||
def test_get_ask_mapping_integrity():
|
||||
assert list(WorkflowJobTemplate.get_ask_mapping().keys()) == ['extra_vars', 'inventory', 'limit', 'scm_branch']
|
||||
|
||||
@@ -5,7 +5,6 @@ from datetime import timedelta
|
||||
|
||||
|
||||
@pytest.mark.parametrize("job_name,function_path", [
|
||||
('admin_checks', 'awx.main.tasks.run_administrative_checks'),
|
||||
('tower_scheduler', 'awx.main.tasks.awx_periodic_scheduler'),
|
||||
])
|
||||
def test_CELERYBEAT_SCHEDULE(mocker, job_name, function_path):
|
||||
|
||||
@@ -288,24 +288,30 @@ class AWXProxyHandler(logging.Handler):
|
||||
'''
|
||||
|
||||
thread_local = threading.local()
|
||||
_auditor = None
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
# TODO: process 'level' kwarg
|
||||
super(AWXProxyHandler, self).__init__(**kwargs)
|
||||
self._handler = None
|
||||
self._old_kwargs = {}
|
||||
self._auditor = logging.handlers.RotatingFileHandler(
|
||||
filename='/var/log/tower/external.log',
|
||||
maxBytes=1024 * 1024 * 50, # 50 MB
|
||||
backupCount=5,
|
||||
)
|
||||
|
||||
class WritableLogstashFormatter(LogstashFormatter):
|
||||
@classmethod
|
||||
def serialize(cls, message):
|
||||
return json.dumps(message)
|
||||
@property
|
||||
def auditor(self):
|
||||
if not self._auditor:
|
||||
self._auditor = logging.handlers.RotatingFileHandler(
|
||||
filename='/var/log/tower/external.log',
|
||||
maxBytes=1024 * 1024 * 50, # 50 MB
|
||||
backupCount=5,
|
||||
)
|
||||
|
||||
self._auditor.setFormatter(WritableLogstashFormatter())
|
||||
class WritableLogstashFormatter(LogstashFormatter):
|
||||
@classmethod
|
||||
def serialize(cls, message):
|
||||
return json.dumps(message)
|
||||
|
||||
self._auditor.setFormatter(WritableLogstashFormatter())
|
||||
return self._auditor
|
||||
|
||||
def get_handler_class(self, protocol):
|
||||
return HANDLER_MAPPING.get(protocol, AWXNullHandler)
|
||||
@@ -340,8 +346,8 @@ class AWXProxyHandler(logging.Handler):
|
||||
if AWXProxyHandler.thread_local.enabled:
|
||||
actual_handler = self.get_handler()
|
||||
if settings.LOG_AGGREGATOR_AUDIT:
|
||||
self._auditor.setLevel(settings.LOG_AGGREGATOR_LEVEL)
|
||||
self._auditor.emit(record)
|
||||
self.auditor.setLevel(settings.LOG_AGGREGATOR_LEVEL)
|
||||
self.auditor.emit(record)
|
||||
return actual_handler.emit(record)
|
||||
|
||||
def perform_test(self, custom_settings):
|
||||
|
||||
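The auditor change above moves handler construction out of `__init__` and behind a lazily-evaluated property, so importing the module no longer requires the log directory to exist. A generic sketch of that pattern, using a plain `Formatter` in place of the logstash formatter and an illustrative file path:

```python
import logging
import logging.handlers


class AuditedHandler(logging.Handler):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self._auditor = None

    @property
    def auditor(self):
        # Build the rotating file handler only on first access.
        if self._auditor is None:
            self._auditor = logging.handlers.RotatingFileHandler(
                filename='/tmp/external.log',
                maxBytes=1024 * 1024 * 50,  # 50 MB
                backupCount=5,
            )
            self._auditor.setFormatter(logging.Formatter('%(message)s'))
        return self._auditor

    def emit(self, record):
        self.auditor.emit(record)
```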
@@ -49,12 +49,6 @@ else:
|
||||
DEBUG = True
|
||||
SQL_DEBUG = DEBUG
|
||||
|
||||
ADMINS = (
|
||||
# ('Your Name', 'your_email@domain.com'),
|
||||
)
|
||||
|
||||
MANAGERS = ADMINS
|
||||
|
||||
DATABASES = {
|
||||
'default': {
|
||||
'ENGINE': 'django.db.backends.sqlite3',
|
||||
@@ -382,34 +376,6 @@ AUTH_BASIC_ENABLED = True
|
||||
# If set, serve only minified JS for UI.
|
||||
USE_MINIFIED_JS = False
|
||||
|
||||
# Email address that error messages come from.
|
||||
SERVER_EMAIL = 'root@localhost'
|
||||
|
||||
# Default email address to use for various automated correspondence from
|
||||
# the site managers.
|
||||
DEFAULT_FROM_EMAIL = 'tower@localhost'
|
||||
|
||||
# Subject-line prefix for email messages send with django.core.mail.mail_admins
|
||||
# or ...mail_managers. Make sure to include the trailing space.
|
||||
EMAIL_SUBJECT_PREFIX = '[Tower] '
|
||||
|
||||
# The email backend to use. For possible shortcuts see django.core.mail.
|
||||
# The default is to use the SMTP backend.
|
||||
# Third-party backends can be specified by providing a Python path
|
||||
# to a module that defines an EmailBackend class.
|
||||
EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
|
||||
|
||||
# Host for sending email.
|
||||
EMAIL_HOST = 'localhost'
|
||||
|
||||
# Port for sending email.
|
||||
EMAIL_PORT = 25
|
||||
|
||||
# Optional SMTP authentication information for EMAIL_HOST.
|
||||
EMAIL_HOST_USER = ''
|
||||
EMAIL_HOST_PASSWORD = ''
|
||||
EMAIL_USE_TLS = False
|
||||
|
||||
# Default to skipping isolated host key checking (the initial connection will
|
||||
# hang on an interactive "The authenticity of host example.org can't be
|
||||
# established" message)
|
||||
@@ -457,10 +423,6 @@ CELERYBEAT_SCHEDULE = {
|
||||
'schedule': timedelta(seconds=30),
|
||||
'options': {'expires': 20,}
|
||||
},
|
||||
'admin_checks': {
|
||||
'task': 'awx.main.tasks.run_administrative_checks',
|
||||
'schedule': timedelta(days=30)
|
||||
},
|
||||
'cluster_heartbeat': {
|
||||
'task': 'awx.main.tasks.cluster_node_heartbeat',
|
||||
'schedule': timedelta(seconds=60),
|
||||
@@ -635,16 +597,18 @@ PRIMARY_GALAXY_USERNAME = ''
|
||||
PRIMARY_GALAXY_TOKEN = ''
|
||||
PRIMARY_GALAXY_PASSWORD = ''
|
||||
PRIMARY_GALAXY_AUTH_URL = ''
|
||||
# Settings for the fallback galaxy server(s), normally this is the
|
||||
# actual Ansible Galaxy site.
|
||||
# server options: 'id', 'url', 'username', 'password', 'token', 'auth_url'
|
||||
# To not use any fallback servers set this to []
|
||||
FALLBACK_GALAXY_SERVERS = [
|
||||
{
|
||||
'id': 'galaxy',
|
||||
'url': 'https://galaxy.ansible.com'
|
||||
}
|
||||
]
|
||||
|
||||
# Settings for the public galaxy server(s).
|
||||
PUBLIC_GALAXY_ENABLED = True
|
||||
PUBLIC_GALAXY_SERVER = {
|
||||
'id': 'galaxy',
|
||||
'url': 'https://galaxy.ansible.com'
|
||||
}
|
||||
|
||||
# List of dicts of fallback (additional) Galaxy servers. If configured, these
|
||||
# will be higher precedence than public Galaxy, but lower than primary Galaxy.
|
||||
# Available options: 'id', 'url', 'username', 'password', 'token', 'auth_url'
|
||||
FALLBACK_GALAXY_SERVERS = []
|
||||
|
||||
# Enable bubblewrap support for running jobs (playbook runs only).
|
||||
# Note: This setting may be overridden by database settings.
|
||||
@@ -978,9 +942,6 @@ FACT_CACHE_PORT = 6564
|
||||
ORG_ADMINS_CAN_SEE_ALL_USERS = True
|
||||
MANAGE_ORGANIZATION_AUTH = True
|
||||
|
||||
# Note: This setting may be overridden by database settings.
|
||||
TOWER_ADMIN_ALERTS = True
|
||||
|
||||
# Note: This setting may be overridden by database settings.
|
||||
TOWER_URL_BASE = "https://towerhost"
|
||||
|
||||
@@ -1061,11 +1022,6 @@ LOGGING = {
|
||||
'formatter': 'json',
|
||||
'filters': ['external_log_enabled', 'dynamic_level_filter'],
|
||||
},
|
||||
'mail_admins': {
|
||||
'level': 'ERROR',
|
||||
'filters': ['require_debug_false'],
|
||||
'class': 'django.utils.log.AdminEmailHandler',
|
||||
},
|
||||
'tower_warnings': {
|
||||
# don't define a level here, it's set by settings.LOG_AGGREGATOR_LEVEL
|
||||
'class': 'logging.handlers.RotatingFileHandler',
|
||||
|
||||
@@ -52,6 +52,9 @@ COLOR_LOGS = True
|
||||
# Pipe management playbook output to console
|
||||
LOGGING['loggers']['awx.isolated.manager.playbooks']['propagate'] = True # noqa
|
||||
|
||||
# celery is annoyingly loud when docker containers start
|
||||
LOGGING['loggers'].pop('celery', None) # noqa
|
||||
|
||||
ALLOWED_HOSTS = ['*']
|
||||
|
||||
mimetypes.add_type("image/svg+xml", ".svg", True)
|
||||
|
||||
@@ -15,12 +15,6 @@ import os
|
||||
import urllib.parse
|
||||
import sys
|
||||
|
||||
ADMINS = (
|
||||
# ('Your Name', 'your_email@domain.com'),
|
||||
)
|
||||
|
||||
MANAGERS = ADMINS
|
||||
|
||||
# Enable the following lines and install the browser extension to use Django debug toolbar
|
||||
# if your deployment method is not VMWare of Docker-for-Mac you may
|
||||
# need a different IP address from request.META['REMOTE_ADDR']
|
||||
@@ -69,7 +63,7 @@ CHANNEL_LAYERS = {
|
||||
|
||||
# Absolute filesystem path to the directory to host projects (with playbooks).
|
||||
# This directory should NOT be web-accessible.
|
||||
PROJECTS_ROOT = '/projects/'
|
||||
PROJECTS_ROOT = '/var/lib/awx/projects/'
|
||||
|
||||
# Absolute filesystem path to the directory for job status stdout
|
||||
# This directory should not be web-accessible
|
||||
@@ -117,38 +111,6 @@ PROXY_IP_WHITELIST = []
|
||||
# If set, use -vvv for project updates instead of -v for more output.
|
||||
# PROJECT_UPDATE_VVV=True
|
||||
|
||||
###############################################################################
|
||||
# EMAIL SETTINGS
|
||||
###############################################################################
|
||||
|
||||
# Email address that error messages come from.
|
||||
SERVER_EMAIL = 'root@localhost'
|
||||
|
||||
# The email backend to use. For possible shortcuts see django.core.mail.
|
||||
# The default is to use the SMTP backend.
|
||||
# Third-party backends can be specified by providing a Python path
|
||||
# to a module that defines an EmailBackend class.
|
||||
EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
|
||||
|
||||
# Host for sending email.
|
||||
EMAIL_HOST = 'localhost'
|
||||
|
||||
# Port for sending email.
|
||||
EMAIL_PORT = 25
|
||||
|
||||
# Optional SMTP authentication information for EMAIL_HOST.
|
||||
EMAIL_HOST_USER = ''
|
||||
EMAIL_HOST_PASSWORD = ''
|
||||
EMAIL_USE_TLS = False
|
||||
|
||||
# Default email address to use for various automated correspondence from
|
||||
# the site managers.
|
||||
DEFAULT_FROM_EMAIL = 'webmaster@localhost'
|
||||
|
||||
# Subject-line prefix for email messages send with django.core.mail.mail_admins
|
||||
# or ...mail_managers. Make sure to include the trailing space.
|
||||
EMAIL_SUBJECT_PREFIX = '[AWX] '
|
||||
|
||||
###############################################################################
|
||||
# LOGGING SETTINGS
|
||||
###############################################################################
|
||||
|
||||
@@ -12,12 +12,6 @@
|
||||
# MISC PROJECT SETTINGS
|
||||
###############################################################################
|
||||
|
||||
ADMINS = (
|
||||
# ('Your Name', 'your_email@domain.com'),
|
||||
)
|
||||
|
||||
MANAGERS = ADMINS
|
||||
|
||||
# Database settings to use PostgreSQL for development.
|
||||
DATABASES = {
|
||||
'default': {
|
||||
@@ -97,38 +91,6 @@ PROXY_IP_WHITELIST = []
|
||||
# If set, use -vvv for project updates instead of -v for more output.
|
||||
# PROJECT_UPDATE_VVV=True
|
||||
|
||||
###############################################################################
|
||||
# EMAIL SETTINGS
|
||||
###############################################################################
|
||||
|
||||
# Email address that error messages come from.
|
||||
SERVER_EMAIL = 'root@localhost'
|
||||
|
||||
# The email backend to use. For possible shortcuts see django.core.mail.
|
||||
# The default is to use the SMTP backend.
|
||||
# Third-party backends can be specified by providing a Python path
|
||||
# to a module that defines an EmailBackend class.
|
||||
EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
|
||||
|
||||
# Host for sending email.
|
||||
EMAIL_HOST = 'localhost'
|
||||
|
||||
# Port for sending email.
|
||||
EMAIL_PORT = 25
|
||||
|
||||
# Optional SMTP authentication information for EMAIL_HOST.
|
||||
EMAIL_HOST_USER = ''
|
||||
EMAIL_HOST_PASSWORD = ''
|
||||
EMAIL_USE_TLS = False
|
||||
|
||||
# Default email address to use for various automated correspondence from
|
||||
# the site managers.
|
||||
DEFAULT_FROM_EMAIL = 'webmaster@localhost'
|
||||
|
||||
# Subject-line prefix for email messages send with django.core.mail.mail_admins
|
||||
# or ...mail_managers. Make sure to include the trailing space.
|
||||
EMAIL_SUBJECT_PREFIX = '[AWX] '
|
||||
|
||||
###############################################################################
|
||||
# LOGGING SETTINGS
|
||||
###############################################################################
|
||||
|
||||
@@ -11,7 +11,6 @@ import awx
|
||||
# Django
|
||||
from django.utils import six
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.core.validators import URLValidator, _lazy_re_compile
|
||||
|
||||
# Django Auth LDAP
|
||||
import django_auth_ldap.config
|
||||
@@ -234,34 +233,12 @@ class AuthenticationBackendsField(fields.StringListField):
|
||||
|
||||
class LDAPServerURIField(fields.URLField):
|
||||
|
||||
tld_re = (
|
||||
r'\.' # dot
|
||||
r'(?!-)' # can't start with a dash
|
||||
r'(?:[a-z' + URLValidator.ul + r'0-9' + '-]{2,63}' # domain label, this line was changed from the original URLValidator
|
||||
r'|xn--[a-z0-9]{1,59})' # or punycode label
|
||||
r'(?<!-)' # can't end with a dash
|
||||
r'\.?' # may have a trailing dot
|
||||
)
|
||||
|
||||
host_re = '(' + URLValidator.hostname_re + URLValidator.domain_re + tld_re + '|localhost)'
|
||||
|
||||
regex = _lazy_re_compile(
|
||||
r'^(?:[a-z0-9\.\-\+]*)://' # scheme is validated separately
|
||||
r'(?:[^\s:@/]+(?::[^\s:@/]*)?@)?' # user:pass authentication
|
||||
r'(?:' + URLValidator.ipv4_re + '|' + URLValidator.ipv6_re + '|' + host_re + ')'
|
||||
r'(?::\d{2,5})?' # port
|
||||
r'(?:[/?#][^\s]*)?' # resource path
|
||||
r'\Z', re.IGNORECASE)
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
|
||||
kwargs.setdefault('schemes', ('ldap', 'ldaps'))
|
||||
kwargs.setdefault('allow_plain_hostname', True)
|
||||
kwargs.setdefault('regex', LDAPServerURIField.regex)
|
||||
super(LDAPServerURIField, self).__init__(**kwargs)
|
||||
|
||||
def run_validators(self, value):
|
||||
|
||||
for url in filter(None, re.split(r'[, ]', (value or ''))):
|
||||
super(LDAPServerURIField, self).run_validators(url)
|
||||
return value
|
||||
|
||||
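`run_validators` above splits the setting on commas and spaces and validates each LDAP URI on its own. A rough standalone approximation of that behaviour (scheme check only; the real field performs full URL validation via the parent `URLField`):

```python
import re
from urllib.parse import urlparse


def validate_ldap_uris(value, allowed_schemes=('ldap', 'ldaps')):
    # Split "ldap://a, ldaps://b:636" style values and check each entry.
    for url in filter(None, re.split(r'[, ]', value or '')):
        scheme = urlparse(url).scheme
        if scheme not in allowed_schemes:
            raise ValueError(f'{url!r} does not look like an LDAP URI')
    return value


print(validate_ldap_uris('ldap://ldap1.example.com, ldaps://ldap2.example.com:636'))
```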
@@ -282,10 +282,12 @@ function getLaunchedByDetails () {
|
||||
tooltip = strings.get('tooltips.SCHEDULE');
|
||||
link = `/#/templates/job_template/${jobTemplate.id}/schedules/${schedule.id}`;
|
||||
value = $filter('sanitize')(schedule.name);
|
||||
} else {
|
||||
} else if (schedule) {
|
||||
tooltip = null;
|
||||
link = null;
|
||||
value = $filter('sanitize')(schedule.name);
|
||||
} else {
|
||||
return null;
|
||||
}
|
||||
|
||||
return { label, link, tooltip, value };
|
||||
|
||||
@@ -5,7 +5,7 @@
|
||||
<!-- LEFT PANE HEADER ACTIONS -->
|
||||
<div class="JobResults-panelHeaderButtonActions">
|
||||
<!-- RELAUNCH ACTION -->
|
||||
<at-relaunch job="vm.job"></at-relaunch>
|
||||
<at-relaunch ng-if="vm.job" job="vm.job"></at-relaunch>
|
||||
|
||||
<!-- CANCEL ACTION -->
|
||||
<button
|
||||
|
||||
@@ -213,8 +213,8 @@ function JobRenderService ($q, $compile, $sce, $window) {
|
||||
const record = this.createRecord(event, lines);
|
||||
|
||||
if (lines.length === 1 && lines[0] === '') {
|
||||
// Some events, mainly runner_on_start events, have an actual line count of 1
|
||||
// (stdout = '') and a claimed line count of 0 (end_line - start_line = 0).
|
||||
// runner_on_start, runner_on_ok, and a few other events have an actual line count
|
||||
// of 1 (stdout = '') and a claimed line count of 0 (end_line - start_line = 0).
|
||||
// Since a zero-length string has an actual line count of 1, they'll still get
|
||||
// rendered as blank lines unless we intercept them and add some special
|
||||
// handling to remove them.
|
||||
|
||||
@@ -208,6 +208,7 @@
|
||||
max-width: none !important;
|
||||
width: 100% !important;
|
||||
padding-right: 0px !important;
|
||||
margin-top: 10px;
|
||||
}
|
||||
|
||||
.Form-formGroup--checkbox{
|
||||
|
||||
@@ -15,7 +15,9 @@
|
||||
title="{{ label || vm.strings.get('code_mirror.label.VARIABLES') }}"
|
||||
tabindex="-1"
|
||||
ng-if="tooltip">
|
||||
<i class="fa fa-question-circle"></i>
|
||||
<span class="at-Popover-icon" ng-class="{ 'at-Popover-icon--defaultCursor': popover.on === 'mouseenter' && !popover.click }">
|
||||
<i class="fa fa-question-circle"></i>
|
||||
</span>
|
||||
</a>
|
||||
<div class="atCodeMirror-toggleContainer FormToggle-container">
|
||||
<div id="{{ name }}_parse_type" class="btn-group">
|
||||
|
||||
@@ -202,6 +202,7 @@
|
||||
.at-Row-toggle {
|
||||
align-self: flex-start;
|
||||
margin-right: @at-space-4x;
|
||||
margin-left: 15px;
|
||||
}
|
||||
|
||||
.at-Row-actions {
|
||||
@@ -385,29 +386,3 @@
|
||||
margin-right: @at-margin-right-list-row-item-inline-label;
|
||||
}
|
||||
}
|
||||
|
||||
@media screen and (max-width: @at-breakpoint-instances-wrap) {
|
||||
.at-Row-items--instances {
|
||||
margin-bottom: @at-padding-bottom-instances-wrap;
|
||||
}
|
||||
}
|
||||
|
||||
@media screen and (max-width: @at-breakpoint-compact-list) {
|
||||
.at-Row-actions {
|
||||
align-items: center;
|
||||
}
|
||||
|
||||
.at-RowAction {
|
||||
margin: @at-margin-list-row-action-mobile;
|
||||
}
|
||||
|
||||
.at-RowItem--inline {
|
||||
display: flex;
|
||||
margin-right: inherit;
|
||||
|
||||
.at-RowItem-label {
|
||||
width: @at-width-list-row-item-label;
|
||||
margin-right: inherit;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -89,6 +89,9 @@ export default ['i18n', function(i18n) {
|
||||
type: 'text',
|
||||
reset: 'PRIMARY_GALAXY_AUTH_URL',
|
||||
},
|
||||
PUBLIC_GALAXY_ENABLED: {
|
||||
type: 'toggleSwitch',
|
||||
},
|
||||
AWX_TASK_ENV: {
|
||||
type: 'textarea',
|
||||
reset: 'AWX_TASK_ENV',
|
||||
|
||||
@@ -15,9 +15,6 @@ export default ['i18n', function(i18n) {
|
||||
type: 'text',
|
||||
reset: 'TOWER_URL_BASE',
|
||||
},
|
||||
TOWER_ADMIN_ALERTS: {
|
||||
type: 'toggleSwitch',
|
||||
},
|
||||
ORG_ADMINS_CAN_SEE_ALL_USERS: {
|
||||
type: 'toggleSwitch',
|
||||
},
|
||||
|
||||
@@ -1,9 +1,11 @@
|
||||
.CapacityAdjuster {
|
||||
margin-right: @at-space-4x;
|
||||
margin-top: 15px;
|
||||
margin-left: -10px;
|
||||
position: relative;
|
||||
|
||||
&-valueLabel {
|
||||
bottom: @at-space-5x;
|
||||
top: -10px;
|
||||
color: @at-color-body-text;
|
||||
font-size: @at-font-size;
|
||||
position: absolute;
|
||||
|
||||
@@ -5,6 +5,8 @@ capacity-bar {
|
||||
font-size: @at-font-size;
|
||||
min-width: 100px;
|
||||
white-space: nowrap;
|
||||
margin-top: 5px;
|
||||
margin-bottom: 5px;
|
||||
|
||||
.CapacityBar {
|
||||
background-color: @default-bg;
|
||||
@@ -42,12 +44,4 @@ capacity-bar {
|
||||
text-align: right;
|
||||
text-transform: uppercase;
|
||||
}
|
||||
|
||||
.Capacity-details--percentage {
|
||||
width: 40px;
|
||||
}
|
||||
|
||||
&:only-child {
|
||||
margin-right: 50px;
|
||||
}
|
||||
}
|
||||
@@ -12,6 +12,7 @@ function AddContainerGroupController(ToJSON, $scope, $state, models, strings, i1
|
||||
|
||||
vm.form = instanceGroup.createFormSchema('post');
|
||||
vm.form.name.required = true;
|
||||
delete vm.form.name.help_text;
|
||||
|
||||
vm.form.credential = {
|
||||
type: 'field',
|
||||
@@ -22,6 +23,7 @@ function AddContainerGroupController(ToJSON, $scope, $state, models, strings, i1
|
||||
vm.form.credential._route = "instanceGroups.addContainerGroup.credentials";
|
||||
vm.form.credential._model = credential;
|
||||
vm.form.credential._placeholder = strings.get('container.CREDENTIAL_PLACEHOLDER');
|
||||
vm.form.credential.help_text = strings.get('container.CREDENTIAL_HELP_TEXT');
|
||||
vm.form.credential.required = true;
|
||||
|
||||
vm.form.extraVars = {
|
||||
@@ -29,6 +31,7 @@ function AddContainerGroupController(ToJSON, $scope, $state, models, strings, i1
|
||||
value: DataSet.data.actions.POST.pod_spec_override.default,
|
||||
name: 'extraVars',
|
||||
toggleLabel: strings.get('container.POD_SPEC_TOGGLE'),
|
||||
tooltip: strings.get('container.EXTRA_VARS_HELP_TEXT')
|
||||
};
|
||||
|
||||
vm.tab = {
|
||||
|
||||
@@ -1,8 +1,8 @@
|
||||
<div ui-view="credentials"></div>
|
||||
<a class="containerGroups-messageBar-link" href="https://docs.ansible.com/ansible-tower/latest/html/administration/external_execution_envs.html#container-group-considerations" target="_blank" style="color: white">
|
||||
<a class="containerGroups-messageBar-link" href="https://docs.ansible.com/ansible-tower/latest/html/administration/external_execution_envs.html#container-groups" target="_blank" style="color: white">
|
||||
<div class="Section-messageBar">
|
||||
<i class="Section-messageBar-warning fa fa-warning"></i>
|
||||
<span class="Section-messageBar-text">This feature is tech preview, and is subject to change in a future release. Click here for documentation.</span>
|
||||
<span class="Section-messageBar-text">This feature is currently in tech preview and is subject to change in a future release. Click here for documentation.</span>
|
||||
</div>
|
||||
</a>
|
||||
<at-panel>
|
||||
@@ -34,6 +34,7 @@
|
||||
variables="vm.form.extraVars.value"
|
||||
label="{{ vm.form.extraVars.label }}"
|
||||
name="{{ vm.form.extraVars.name }}"
|
||||
tooltip="{{ vm.form.extraVars.tooltip }}"
|
||||
>
|
||||
</at-code-mirror>
|
||||
</div>
|
||||
|
||||
@@ -27,6 +27,7 @@ function EditContainerGroupController($rootScope, $scope, $state, models, string
|
||||
vm.switchDisabled = false;
|
||||
vm.form.disabled = !instanceGroup.has('options', 'actions.PUT');
|
||||
vm.form.name.required = true;
|
||||
delete vm.form.name.help_text;
|
||||
vm.form.credential = {
|
||||
type: 'field',
|
||||
label: i18n._('Credential'),
|
||||
@@ -38,6 +39,7 @@ function EditContainerGroupController($rootScope, $scope, $state, models, string
|
||||
vm.form.credential._displayValue = EditContainerGroupDataset.data.summary_fields.credential.name;
|
||||
vm.form.credential.required = true;
|
||||
vm.form.credential._value = EditContainerGroupDataset.data.summary_fields.credential.id;
|
||||
vm.form.credential.help_text = strings.get('container.CREDENTIAL_HELP_TEXT');
|
||||
|
||||
vm.tab = {
|
||||
details: {
|
||||
@@ -59,7 +61,8 @@ function EditContainerGroupController($rootScope, $scope, $state, models, string
|
||||
label: strings.get('container.POD_SPEC_LABEL'),
|
||||
value: EditContainerGroupDataset.data.pod_spec_override || "---",
|
||||
name: 'extraVars',
|
||||
disabled: true
|
||||
disabled: true,
|
||||
tooltip: strings.get('container.EXTRA_VARS_HELP_TEXT')
|
||||
};
|
||||
vm.switchDisabled = true;
|
||||
} else {
|
||||
@@ -67,7 +70,8 @@ function EditContainerGroupController($rootScope, $scope, $state, models, string
|
||||
label: strings.get('container.POD_SPEC_LABEL'),
|
||||
value: EditContainerGroupDataset.data.pod_spec_override || instanceGroup.model.OPTIONS.actions.PUT.pod_spec_override.default,
|
||||
name: 'extraVars',
|
||||
toggleLabel: strings.get('container.POD_SPEC_TOGGLE')
|
||||
toggleLabel: strings.get('container.POD_SPEC_TOGGLE'),
|
||||
tooltip: strings.get('container.EXTRA_VARS_HELP_TEXT')
|
||||
};
|
||||
}
|
||||
|
||||
|
||||
@@ -1,135 +1,100 @@
|
||||
.InstanceGroups {
|
||||
.at-Row-actions{
|
||||
justify-content: flex-start;
|
||||
width: 300px;
|
||||
& > capacity-bar:only-child{
|
||||
margin-left: 0px;
|
||||
margin-top: 5px
|
||||
}
|
||||
}
|
||||
.at-RowAction{
|
||||
margin: 0;
|
||||
}
|
||||
.at-Row-links{
|
||||
justify-content: flex-start;
|
||||
.at-Row--instances {
|
||||
.at-Row-content {
|
||||
flex-wrap: nowrap;
|
||||
}
|
||||
|
||||
.BreadCrumb-menuLinkImage:hover {
|
||||
color: @default-link;
|
||||
.at-Row-toggle {
|
||||
align-self: auto;
|
||||
flex: initial;
|
||||
}
|
||||
|
||||
.List-details {
|
||||
align-self: flex-end;
|
||||
color: @default-interface-txt;
|
||||
.at-Row-itemGroup {
|
||||
display: flex;
|
||||
flex: 0 0 auto;
|
||||
font-size: 12px;
|
||||
margin-right:20px;
|
||||
text-transform: uppercase;
|
||||
flex: 1;
|
||||
flex-wrap: wrap;
|
||||
}
|
||||
|
||||
.Capacity-details {
|
||||
.at-Row-items--instances {
|
||||
display: flex;
|
||||
margin-right: 20px;
|
||||
flex-wrap: wrap;
|
||||
align-items: center;
|
||||
|
||||
.Capacity-details--label {
|
||||
color: @default-interface-txt;
|
||||
margin: 0 10px 0 0;
|
||||
width: 100px;
|
||||
}
|
||||
align-content: center;
|
||||
flex: 1;
|
||||
}
|
||||
|
||||
.RunningJobs-details {
|
||||
align-items: center;
|
||||
display: flex;
|
||||
|
||||
.RunningJobs-details--label {
|
||||
margin: 0 10px 0 0;
|
||||
}
|
||||
.at-RowItem--isHeader {
|
||||
min-width: 250px;
|
||||
}
|
||||
|
||||
.List-tableCell--capacityColumn {
|
||||
.at-Row-items--capacity {
|
||||
display: flex;
|
||||
height: 40px;
|
||||
flex-wrap: wrap;
|
||||
align-items: center;
|
||||
}
|
||||
|
||||
.List-noItems {
|
||||
margin-top: 20px;
|
||||
}
|
||||
|
||||
.List-tableRow .List-titleBadge {
|
||||
margin: 0 0 0 5px;
|
||||
}
|
||||
|
||||
.Panel-docsLink {
|
||||
cursor: pointer;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
padding: 7px;
|
||||
background: @at-white;
|
||||
border-radius: @at-border-radius;
|
||||
height: 30px;
|
||||
width: 30px;
|
||||
margin: 0 20px 0 auto;
|
||||
|
||||
i {
|
||||
font-size: @at-font-size-icon;
|
||||
color: @at-gray-646972;
|
||||
}
|
||||
}
|
||||
|
||||
.Panel-docsLink:hover {
|
||||
background-color: @at-blue;
|
||||
|
||||
i {
|
||||
color: @at-white;
|
||||
}
|
||||
}
|
||||
.at-Row-toggle{
|
||||
margin-top: 20px;
|
||||
padding-left: 15px;
|
||||
}
|
||||
|
||||
.ContainerGroups-codeMirror{
|
||||
margin-bottom: 10px;
|
||||
}
|
||||
|
||||
.at-Row-container{
|
||||
flex-wrap: wrap;
|
||||
}
|
||||
|
||||
.containerGroups-messageBar-link:hover{
|
||||
text-decoration: underline;
|
||||
}
|
||||
|
||||
@media screen and (max-width: 1060px) and (min-width: 769px){
|
||||
.at-Row-links {
|
||||
justify-content: flex-start;
|
||||
flex-wrap: wrap;
|
||||
}
|
||||
}
|
||||
|
||||
@media screen and (min-width: 1061px){
|
||||
.at-Row-actions{
|
||||
justify-content: flex-end;
|
||||
& > capacity-bar:only-child {
|
||||
margin-right: 30px;
|
||||
}
|
||||
}
|
||||
.instanceGroupsList-details{
|
||||
display: flex;
|
||||
}
|
||||
.at-Row-links {
|
||||
justify-content: flex-end;
|
||||
display: flex;
|
||||
width: 445px;
|
||||
}
|
||||
.CapacityAdjuster {
|
||||
padding-bottom: 15px;
|
||||
}
|
||||
}
|
||||
|
||||
.at-Row--instanceGroups {
|
||||
.at-Row-content {
|
||||
flex-wrap: nowrap;
|
||||
}
|
||||
|
||||
.at-Row-itemGroup {
|
||||
display: flex;
|
||||
flex: 1;
|
||||
flex-wrap: wrap;
|
||||
}
|
||||
|
||||
.at-Row-items--instanceGroups {
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
align-items: center;
|
||||
flex: 1;
|
||||
max-width: 100%;
|
||||
}
|
||||
|
||||
.at-Row-itemHeaderGroup {
|
||||
min-width: 320px;
|
||||
display: flex;
|
||||
}
|
||||
|
||||
.at-Row-items--capacity {
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
align-items: center;
|
||||
margin-right: 5px;
|
||||
min-width: 215px;
|
||||
}
|
||||
|
||||
.at-Row--instanceSpacer {
|
||||
width: 140px;
|
||||
}
|
||||
|
||||
.at-Row--capacitySpacer {
|
||||
flex: .6;
|
||||
}
|
||||
|
||||
.at-Row-actions {
|
||||
min-width: 50px;
|
||||
}
|
||||
}
|
||||
|
||||
@media screen and (max-width: 1260px) {
|
||||
.at-Row--instances .at-Row-items--capacity {
|
||||
flex: 1
|
||||
}
|
||||
|
||||
.at-Row--instances .CapacityAdjuster {
|
||||
padding-bottom: 5px;
|
||||
}
|
||||
}
|
||||
|
||||
@media screen and (max-width: 600px) {
|
||||
.at-Row--instanceGroups .at-Row-itemHeaderGroup,
|
||||
.at-Row--instanceGroups .at-Row-itemGroup {
|
||||
max-width: 270px;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -72,8 +72,9 @@ function InstanceGroupsStrings(BaseString) {
|
||||
CREDENTIAL_PLACEHOLDER: t.s('SELECT A CREDENTIAL'),
|
||||
POD_SPEC_LABEL: t.s('Pod Spec Override'),
|
||||
BADGE_TEXT: t.s('Container Group'),
|
||||
POD_SPEC_TOGGLE: t.s('Customize Pod Spec')
|
||||
|
||||
POD_SPEC_TOGGLE: t.s('Customize Pod Spec'),
|
||||
CREDENTIAL_HELP_TEXT: t.s('Credential to authenticate with Kubernetes or OpenShift. Must be of type \"Kubernetes/OpenShift API Bearer Token\".'),
|
||||
EXTRA_VARS_HELP_TEXT: t.s('Field for passing a custom Kubernetes or OpenShift Pod specification.')
|
||||
};
|
||||
}
|
||||
|
||||
|
||||
@@ -43,35 +43,45 @@
|
||||
</at-list-toolbar>
|
||||
<at-list results='vm.instances'>
|
||||
<at-row ng-repeat="instance in vm.instances"
|
||||
ng-class="{'at-Row--active': (instance.id === vm.activeId)}">
|
||||
ng-class="{'at-Row--active': (instance.id === vm.activeId)}"
|
||||
class="at-Row--instances">
|
||||
<div class="at-Row-toggle">
|
||||
<at-switch on-toggle="vm.toggle(instance)" switch-on="instance.enabled" switch-disabled="vm.rowAction.toggle._disabled"></at-switch>
|
||||
</div>
|
||||
|
||||
<div class="at-Row-items at-Row-items--instances">
|
||||
<at-row-item
|
||||
header-value="{{ instance.hostname }}"
|
||||
header-tag="{{ instance.managed_by_policy ? '' : vm.strings.get('list.MANUAL') }}">
|
||||
</at-row-item>
|
||||
<at-row-item
|
||||
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_RUNNING_JOBS') }}"
|
||||
label-state="instanceGroups.instanceJobs({instance_group_id: {{vm.instance_group_id}}, instance_id: {{instance.id}}, job_search: {status__in: ['running,waiting']}})"
|
||||
value="{{ instance.jobs_running }}"
|
||||
inline="true"
|
||||
badge="true">
|
||||
</at-row-item>
|
||||
<at-row-item
|
||||
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_TOTAL_JOBS') }}"
|
||||
label-state="instanceGroups.instanceJobs({instance_group_id: {{vm.instance_group_id}}, instance_id: {{instance.id}}})"
|
||||
value="{{ instance.jobs_total }}"
|
||||
inline="true"
|
||||
badge="true">
|
||||
</at-row-item>
|
||||
</div>
|
||||
|
||||
<div class="at-Row-actions">
|
||||
<capacity-adjuster state="instance" disabled="{{vm.rowAction.capacity_adjustment._disabled}}"></capacity-adjuster>
|
||||
<capacity-bar label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_USED_CAPACITY') }}" capacity="instance.consumed_capacity" total-capacity="instance.capacity"></capacity-bar>
|
||||
<div class="at-Row-itemGroup">
|
||||
<div class="at-Row-items at-Row-items--instances">
|
||||
<at-row-item
|
||||
header-value="{{ instance.hostname }}"
|
||||
header-tag="{{ instance.managed_by_policy ? '' : vm.strings.get('list.MANUAL') }}">
|
||||
</at-row-item>
|
||||
<div class="at-Row-nonHeaderItems">
|
||||
<at-row-item
|
||||
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_RUNNING_JOBS') }}"
|
||||
label-state="instanceGroups.instanceJobs({instance_group_id: {{vm.instance_group_id}}, instance_id: {{instance.id}}, job_search: {status__in: ['running,waiting']}})"
|
||||
value="{{ instance.jobs_running }}"
|
||||
inline="true"
|
||||
badge="true">
|
||||
</at-row-item>
|
||||
<at-row-item
|
||||
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_TOTAL_JOBS') }}"
|
||||
label-state="instanceGroups.instanceJobs({instance_group_id: {{vm.instance_group_id}}, instance_id: {{instance.id}}})"
|
||||
value="{{ instance.jobs_total }}"
|
||||
inline="true"
|
||||
badge="true">
|
||||
</at-row-item>
|
||||
</div>
|
||||
</div>
|
||||
<div class="at-Row-items--capacity">
|
||||
<capacity-adjuster
|
||||
state="instance"
|
||||
disabled="{{vm.rowAction.capacity_adjustment._disabled}}">
|
||||
</capacity-adjuster>
|
||||
<capacity-bar
|
||||
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_USED_CAPACITY') }}"
|
||||
capacity="instance.consumed_capacity"
|
||||
total-capacity="instance.capacity">
|
||||
</capacity-bar>
|
||||
</div>
|
||||
</div>
|
||||
</at-row>
|
||||
</at-list>
|
||||
|
||||
@@ -41,10 +41,11 @@
|
||||
</at-list-toolbar>
|
||||
<at-list results="instance_groups">
|
||||
<at-row ng-repeat="instance_group in instance_groups"
|
||||
ng-class="{'at-Row--active': (instance_group.id === vm.activeId)}" >
|
||||
<div class="at-Row-items">
|
||||
<div class="at-Row-container">
|
||||
<div class="at-Row-content">
|
||||
ng-class="{'at-Row--active': (instance_group.id === vm.activeId)}"
|
||||
class="at-Row--instanceGroups">
|
||||
<div class="at-Row-itemGroup">
|
||||
<div class="at-Row-items at-Row-items--instanceGroups">
|
||||
<div class="at-Row-itemHeaderGroup">
|
||||
<at-row-item
|
||||
ng-if="!instance_group.credential"
|
||||
header-value="{{ instance_group.name }}"
|
||||
@@ -67,23 +68,14 @@
|
||||
</div>
|
||||
</div>
|
||||
<div class="at-RowItem--labels" ng-if="!instance_group.credential">
|
||||
<div class="LabelList-tagContainer">
|
||||
<div class="LabelList-tag" ng-class="{'LabelList-tag--deletable' : (showDelete && template.summary_fields.user_capabilities.edit)}">
|
||||
<span class="LabelList-name">{{vm.strings.get('instance.BADGE_TEXT') }}</span>
|
||||
</div>
|
||||
<div class="LabelList-tagContainer">
|
||||
<div class="LabelList-tag" ng-class="{'LabelList-tag--deletable' : (showDelete && template.summary_fields.user_capabilities.edit)}">
|
||||
<span class="LabelList-name">{{vm.strings.get('instance.BADGE_TEXT') }}</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="instanceGroupsList-details">
|
||||
<div class="at-Row-links">
|
||||
<at-row-item
|
||||
ng-if="!instance_group.credential"
|
||||
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_INSTANCES') }}"
|
||||
label-state="instanceGroups.instances({instance_group_id: {{ instance_group.id }}})"
|
||||
value="{{ instance_group.instances }}"
|
||||
inline="true"
|
||||
badge="true">
|
||||
</at-row-item>
|
||||
<div class="at-Row-nonHeaderItems">
|
||||
<at-row-item
|
||||
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_RUNNING_JOBS') }}"
|
||||
label-state="instanceGroups.jobs({instance_group_id: {{ instance_group.id }}, job_search: {status__in: ['running,waiting']}})"
|
||||
@@ -98,14 +90,38 @@
|
||||
inline="true"
|
||||
badge="true">
|
||||
</at-row-item>
|
||||
</div>
|
||||
<div class="at-Row-actions" >
|
||||
<capacity-bar ng-show="!instance_group.credential" label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_USED_CAPACITY') }}" capacity="instance_group.consumed_capacity" total-capacity="instance_group.capacity"></capacity-bar>
|
||||
<at-row-action icon="fa-trash" ng-click="vm.deleteInstanceGroup(instance_group)" ng-if="vm.rowAction.trash(instance_group)">
|
||||
</at-row-action>
|
||||
</div>
|
||||
<at-row-item
|
||||
ng-if="!instance_group.credential"
|
||||
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_INSTANCES') }}"
|
||||
label-state="instanceGroups.instances({instance_group_id: {{ instance_group.id }}})"
|
||||
value="{{ instance_group.instances }}"
|
||||
inline="true"
|
||||
badge="true">
|
||||
</at-row-item>
|
||||
<div
|
||||
ng-if="instance_group.credential"
|
||||
class="at-Row--instanceSpacer">
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="at-Row-items--capacity" ng-if="!instance_group.credential">
|
||||
<capacity-bar
|
||||
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_USED_CAPACITY') }}"
|
||||
capacity="instance_group.consumed_capacity"
|
||||
total-capacity="instance_group.capacity">
|
||||
</capacity-bar>
|
||||
</div>
|
||||
<div
|
||||
ng-if="instance_group.credential"
|
||||
class="at-Row--capacitySpacer">
|
||||
</div>
|
||||
</div>
|
||||
<div class="at-Row-actions" >
|
||||
<at-row-action
|
||||
icon="fa-trash"
|
||||
ng-click="vm.deleteInstanceGroup(instance_group)"
|
||||
ng-if="vm.rowAction.trash(instance_group)">
|
||||
</at-row-action>
|
||||
</div>
|
||||
</at-row>
|
||||
</at-list>
|
||||
|
||||
@@ -671,6 +671,98 @@ export default ['i18n', function(i18n) {
|
||||
"|| notification_type.value == 'webhook')",
|
||||
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
|
||||
},
|
||||
approved_message: {
|
||||
label: i18n._('Workflow Approved Message'),
|
||||
class: 'Form-formGroup--fullWidth',
|
||||
type: 'syntax_highlight',
|
||||
mode: 'jinja2',
|
||||
default: '',
|
||||
ngShow: "customize_messages && notification_type.value != 'webhook'",
|
||||
rows: 2,
|
||||
oneLine: 'true',
|
||||
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
|
||||
},
|
||||
approved_body: {
|
||||
label: i18n._('Workflow Approved Message Body'),
|
||||
class: 'Form-formGroup--fullWidth',
|
||||
type: 'syntax_highlight',
|
||||
mode: 'jinja2',
|
||||
default: '',
|
||||
ngShow: "customize_messages && " +
|
||||
"(notification_type.value == 'email' " +
|
||||
"|| notification_type.value == 'pagerduty' " +
|
||||
"|| notification_type.value == 'webhook')",
|
||||
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
|
||||
},
|
||||
denied_message: {
|
||||
label: i18n._('Workflow Denied Message'),
|
||||
class: 'Form-formGroup--fullWidth',
|
||||
type: 'syntax_highlight',
|
||||
mode: 'jinja2',
|
||||
default: '',
|
||||
ngShow: "customize_messages && notification_type.value != 'webhook'",
|
||||
rows: 2,
|
||||
oneLine: 'true',
|
||||
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
|
||||
},
|
||||
denied_body: {
|
||||
label: i18n._('Workflow Denied Message Body'),
|
||||
class: 'Form-formGroup--fullWidth',
|
||||
type: 'syntax_highlight',
|
||||
mode: 'jinja2',
|
||||
default: '',
|
||||
ngShow: "customize_messages && " +
|
||||
"(notification_type.value == 'email' " +
|
||||
"|| notification_type.value == 'pagerduty' " +
|
||||
"|| notification_type.value == 'webhook')",
|
||||
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
|
||||
},
|
||||
running_message: {
|
||||
label: i18n._('Workflow Running Message'),
|
||||
class: 'Form-formGroup--fullWidth',
|
||||
type: 'syntax_highlight',
|
||||
mode: 'jinja2',
|
||||
default: '',
|
||||
ngShow: "customize_messages && notification_type.value != 'webhook'",
|
||||
rows: 2,
|
||||
oneLine: 'true',
|
||||
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
|
||||
},
|
||||
running_body: {
|
||||
label: i18n._('Workflow Running Message Body'),
|
||||
class: 'Form-formGroup--fullWidth',
|
||||
type: 'syntax_highlight',
|
||||
mode: 'jinja2',
|
||||
default: '',
|
||||
ngShow: "customize_messages && " +
|
||||
"(notification_type.value == 'email' " +
|
||||
"|| notification_type.value == 'pagerduty' " +
|
||||
"|| notification_type.value == 'webhook')",
|
||||
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
|
||||
},
|
||||
timed_out_message: {
|
||||
label: i18n._('Workflow Timed Out Message'),
|
||||
class: 'Form-formGroup--fullWidth',
|
||||
type: 'syntax_highlight',
|
||||
mode: 'jinja2',
|
||||
default: '',
|
||||
ngShow: "customize_messages && notification_type.value != 'webhook'",
|
||||
rows: 2,
|
||||
oneLine: 'true',
|
||||
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
|
||||
},
|
||||
timed_out_body: {
|
||||
label: i18n._('Workflow Timed Out Message Body'),
|
||||
class: 'Form-formGroup--fullWidth',
|
||||
type: 'syntax_highlight',
|
||||
mode: 'jinja2',
|
||||
default: '',
|
||||
ngShow: "customize_messages && " +
|
||||
"(notification_type.value == 'email' " +
|
||||
"|| notification_type.value == 'pagerduty' " +
|
||||
"|| notification_type.value == 'webhook')",
|
||||
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
|
||||
},
|
||||
},
|
||||
|
||||
buttons: { //for now always generates <button> tags
|
||||
|
||||
@@ -1,19 +1,20 @@
|
||||
|
||||
const emptyDefaults = {
|
||||
started: {
|
||||
message: '',
|
||||
body: '',
|
||||
},
|
||||
success: {
|
||||
message: '',
|
||||
body: '',
|
||||
},
|
||||
error: {
|
||||
message: '',
|
||||
body: '',
|
||||
},
|
||||
started: { message: '', body: '' },
|
||||
success: { message: '', body: '' },
|
||||
error: { message: '', body: '' },
|
||||
workflow_approval: {
|
||||
approved: { message: '', body: '' },
|
||||
denied: { message: '', body: '' },
|
||||
running: { message: '', body: '' },
|
||||
timed_out: { message: '', body: '' },
|
||||
}
|
||||
};
|
||||
|
||||
function getMessageIfUpdated(message, defaultValue) {
|
||||
return message === defaultValue ? null : message;
|
||||
}
|
||||
|
||||
export default [function() {
|
||||
return {
|
||||
getMessagesObj: function ($scope, defaultMessages) {
|
||||
@@ -23,22 +24,34 @@ export default [function() {
|
||||
const defaults = defaultMessages[$scope.notification_type.value] || {};
|
||||
return {
|
||||
started: {
|
||||
message: $scope.started_message === defaults.started.message ?
|
||||
null : $scope.started_message,
|
||||
body: $scope.started_body === defaults.started.body ?
|
||||
null : $scope.started_body,
|
||||
message: getMessageIfUpdated($scope.started_message, defaults.started.message),
|
||||
body: getMessageIfUpdated($scope.started_body, defaults.started.body),
|
||||
},
|
||||
success: {
|
||||
message: $scope.success_message === defaults.success.message ?
|
||||
null : $scope.success_message,
|
||||
body: $scope.success_body === defaults.success.body ?
|
||||
null : $scope.success_body,
|
||||
message: getMessageIfUpdated($scope.success_message, defaults.success.message),
|
||||
body: getMessageIfUpdated($scope.success_body, defaults.success.body),
|
||||
},
|
||||
error: {
|
||||
message: $scope.error_message === defaults.error.message ?
|
||||
null : $scope.error_message,
|
||||
body: $scope.error_body === defaults.error.body ?
|
||||
null : $scope.error_body,
|
||||
message: getMessageIfUpdated($scope.error_message, defaults.error.message),
|
||||
body: getMessageIfUpdated($scope.error_body, defaults.error.body),
|
||||
},
|
||||
workflow_approval: {
|
||||
approved: {
|
||||
message: getMessageIfUpdated($scope.approved_message, defaults.workflow_approval.approved.message),
|
||||
body: getMessageIfUpdated($scope.approved_body, defaults.workflow_approval.approved.body),
|
||||
},
|
||||
denied: {
|
||||
message: getMessageIfUpdated($scope.denied_message, defaults.workflow_approval.denied.message),
|
||||
body: getMessageIfUpdated($scope.denied_body, defaults.workflow_approval.denied.body),
|
||||
},
|
||||
running: {
|
||||
message: getMessageIfUpdated($scope.running_message, defaults.workflow_approval.running.message),
|
||||
body: getMessageIfUpdated($scope.running_body, defaults.workflow_approval.running.body),
|
||||
},
|
||||
timed_out: {
|
||||
message: getMessageIfUpdated($scope.timed_out_message, defaults.workflow_approval.timed_out.message),
|
||||
body: getMessageIfUpdated($scope.timed_out_body, defaults.workflow_approval.timed_out.body),
|
||||
},
|
||||
}
|
||||
};
|
||||
},
|
||||
@@ -56,6 +69,15 @@ export default [function() {
|
||||
$scope.success_body = defaults.success.body;
|
||||
$scope.error_message = defaults.error.message;
|
||||
$scope.error_body = defaults.error.body;
|
||||
$scope.approved_message = defaults.workflow_approval.approved.message;
|
||||
$scope.approved_body = defaults.workflow_approval.approved.body;
|
||||
$scope.denied_message = defaults.workflow_approval.denied.message;
|
||||
$scope.denied_body = defaults.workflow_approval.denied.body;
|
||||
$scope.running_message = defaults.workflow_approval.running.message;
|
||||
$scope.running_body = defaults.workflow_approval.running.body;
|
||||
$scope.timed_out_message = defaults.workflow_approval.timed_out.message;
|
||||
$scope.timed_out_body = defaults.workflow_approval.timed_out.body;
|
||||
|
||||
if (!messages) {
|
||||
return;
|
||||
}
|
||||
@@ -84,6 +106,48 @@ export default [function() {
|
||||
isCustomized = true;
|
||||
$scope.error_body = messages.error.body;
|
||||
}
|
||||
if (messages.workflow_approval) {
|
||||
if (messages.workflow_approval.approved &&
|
||||
messages.workflow_approval.approved.message) {
|
||||
isCustomized = true;
|
||||
$scope.approved_message = messages.workflow_approval.approved.message;
|
||||
}
|
||||
if (messages.workflow_approval.approved &&
|
||||
messages.workflow_approval.approved.body) {
|
||||
isCustomized = true;
|
||||
$scope.approved_body = messages.workflow_approval.approved.body;
|
||||
}
|
||||
if (messages.workflow_approval.denied &&
|
||||
messages.workflow_approval.denied.message) {
|
||||
isCustomized = true;
|
||||
$scope.denied_message = messages.workflow_approval.denied.message;
|
||||
}
|
||||
if (messages.workflow_approval.denied &&
|
||||
messages.workflow_approval.denied.body) {
|
||||
isCustomized = true;
|
||||
$scope.denied_body = messages.workflow_approval.denied.body;
|
||||
}
|
||||
if (messages.workflow_approval.running &&
|
||||
messages.workflow_approval.running.message) {
|
||||
isCustomized = true;
|
||||
$scope.running_message = messages.workflow_approval.running.message;
|
||||
}
|
||||
if (messages.workflow_approval.running &&
|
||||
messages.workflow_approval.running.body) {
|
||||
isCustomized = true;
|
||||
$scope.running_body = messages.workflow_approval.running.body;
|
||||
}
|
||||
if (messages.workflow_approval.timed_out &&
|
||||
messages.workflow_approval.timed_out.message) {
|
||||
isCustomized = true;
|
||||
$scope.timed_out_message = messages.workflow_approval.timed_out.message;
|
||||
}
|
||||
if (messages.workflow_approval.timed_out &&
|
||||
messages.workflow_approval.timed_out.body) {
|
||||
isCustomized = true;
|
||||
$scope.timed_out_body = messages.workflow_approval.timed_out.body;
|
||||
}
|
||||
}
|
||||
$scope.customize_messages = isCustomized;
|
||||
},
|
||||
|
||||
@@ -110,6 +174,30 @@ export default [function() {
|
||||
if ($scope.error_body === oldDefaults.error.body) {
|
||||
$scope.error_body = newDefaults.error.body;
|
||||
}
|
||||
if ($scope.approved_message === oldDefaults.workflow_approval.approved.message) {
|
||||
$scope.approved_message = newDefaults.workflow_approval.approved.message;
|
||||
}
|
||||
if ($scope.approved_body === oldDefaults.workflow_approval.approved.body) {
|
||||
$scope.approved_body = newDefaults.workflow_approval.approved.body;
|
||||
}
|
||||
if ($scope.denied_message === oldDefaults.workflow_approval.denied.message) {
|
||||
$scope.denied_message = newDefaults.workflow_approval.denied.message;
|
||||
}
|
||||
if ($scope.denied_body === oldDefaults.workflow_approval.denied.body) {
|
||||
$scope.denied_body = newDefaults.workflow_approval.denied.body;
|
||||
}
|
||||
if ($scope.running_message === oldDefaults.workflow_approval.running.message) {
|
||||
$scope.running_message = newDefaults.workflow_approval.running.message;
|
||||
}
|
||||
if ($scope.running_body === oldDefaults.workflow_approval.running.body) {
|
||||
$scope.running_body = newDefaults.workflow_approval.running.body;
|
||||
}
|
||||
if ($scope.timed_out_message === oldDefaults.workflow_approval.timed_out.message) {
|
||||
$scope.timed_out_message = newDefaults.workflow_approval.timed_out.message;
|
||||
}
|
||||
if ($scope.timed_out_body === oldDefaults.workflow_approval.timed_out.body) {
|
||||
$scope.timed_out_body = newDefaults.workflow_approval.timed_out.body;
|
||||
}
|
||||
}
|
||||
};
|
||||
}];
|
||||
|
||||
@@ -233,6 +233,38 @@ export default [ 'ProcessErrors', 'CredentialTypeModel', 'TemplatesStrings', '$f
|
||||
}, true);
|
||||
};
|
||||
|
||||
function getSelectedTags(tagId) {
|
||||
const selectedTags = [];
|
||||
const choiceElements = $(tagId).siblings(".select2").first()
|
||||
.find(".select2-selection__choice");
|
||||
choiceElements.each((index, option) => {
|
||||
selectedTags.push({
|
||||
value: option.title,
|
||||
name: option.title,
|
||||
label: option.title
|
||||
});
|
||||
});
|
||||
return selectedTags;
|
||||
}
|
||||
|
||||
function consolidateTags (tags, otherTags) {
|
||||
const seen = [];
|
||||
const consolidated = [];
|
||||
tags.forEach(tag => {
|
||||
if (!seen.includes(tag.value)) {
|
||||
seen.push(tag.value);
|
||||
consolidated.push(tag);
|
||||
}
|
||||
});
|
||||
otherTags.forEach(tag => {
|
||||
if (!seen.includes(tag.value)) {
|
||||
seen.push(tag.value);
|
||||
consolidated.push(tag);
|
||||
}
|
||||
});
|
||||
return consolidated;
|
||||
}
|
||||
|
||||
vm.next = (currentTab) => {
|
||||
if(_.has(vm, 'steps.other_prompts.tab._active') && vm.steps.other_prompts.tab._active === true){
|
||||
try {
|
||||
@@ -243,6 +275,22 @@ export default [ 'ProcessErrors', 'CredentialTypeModel', 'TemplatesStrings', '$f
|
||||
event.preventDefault();
|
||||
return;
|
||||
}
|
||||
|
||||
// The current tag input state lives somewhere in the associated select2
|
||||
// widgetry and isn't directly tied to the vm, so extract the tag values
|
||||
// and update the vm to keep it in sync.
|
||||
if (vm.promptDataClone.launchConf.ask_tags_on_launch) {
|
||||
vm.promptDataClone.prompts.tags.value = consolidateTags(
|
||||
angular.copy(vm.promptDataClone.prompts.tags.value),
|
||||
getSelectedTags("#job_launch_job_tags")
|
||||
);
|
||||
}
|
||||
if (vm.promptDataClone.launchConf.ask_skip_tags_on_launch) {
|
||||
vm.promptDataClone.prompts.skipTags.value = consolidateTags(
|
||||
angular.copy(vm.promptDataClone.prompts.skipTags.value),
|
||||
getSelectedTags("#job_launch_skip_tags")
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
let nextStep;
|
||||
|
||||
@@ -12,19 +12,6 @@ export default
|
||||
|
||||
let scope;
|
||||
|
||||
let consolidateTags = (tagModel, tagId) => {
|
||||
let tags = angular.copy(tagModel);
|
||||
$(tagId).siblings(".select2").first().find(".select2-selection__choice").each((optionIndex, option) => {
|
||||
tags.push({
|
||||
value: option.title,
|
||||
name: option.title,
|
||||
label: option.title
|
||||
});
|
||||
});
|
||||
|
||||
return [...tags.reduce((map, tag) => map.has(tag.value) ? map : map.set(tag.value, tag), new Map()).values()];
|
||||
};
|
||||
|
||||
vm.init = (_scope_) => {
|
||||
scope = _scope_;
|
||||
|
||||
@@ -35,14 +22,6 @@ export default
|
||||
|
||||
const surveyPasswords = {};
|
||||
|
||||
if (scope.promptData.launchConf.ask_tags_on_launch) {
|
||||
scope.promptData.prompts.tags.value = consolidateTags(scope.promptData.prompts.tags.value, "#job_launch_job_tags");
|
||||
}
|
||||
|
||||
if (scope.promptData.launchConf.ask_skip_tags_on_launch) {
|
||||
scope.promptData.prompts.skipTags.value = consolidateTags(scope.promptData.prompts.skipTags.value, "#job_launch_skip_tags");
|
||||
}
|
||||
|
||||
if (scope.promptData.launchConf.survey_enabled){
|
||||
scope.promptData.extraVars = ToJSON(scope.parseType, scope.promptData.prompts.variables.value, false);
|
||||
scope.promptData.surveyQuestions.forEach(surveyQuestion => {
|
||||
|
||||
@@ -241,7 +241,7 @@ export default ['NotificationsList', 'i18n', function(NotificationsList, i18n) {
|
||||
on-lookup-click="handleWebhookCredentialLookupClick"
|
||||
on-tag-delete="handleWebhookCredentialTagDelete"
|
||||
</webhook-credential-input>`,
|
||||
awPopOver: "<p>" + i18n._("Select the credential to use with the webhook service.") + "</p>",
|
||||
awPopOver: "<p>" + i18n._("Optionally, select the credential to use to send status updates back to the webhook service.") + "</p>",
|
||||
dataTitle: i18n._('Webhook Credential'),
|
||||
dataPlacement: 'right',
|
||||
dataContainer: "body",
|
||||
|
||||
146
awx/ui_next/package-lock.json
generated
@@ -1186,16 +1186,16 @@
|
||||
"integrity": "sha512-rLu3wcBWH4P5q1CGoSSH/i9hrXs7SlbRLkoq9IGuoPYNGQvDJ3pt/wmOM+XgYjIDRMVIdkUWt0RsfzF50JfnCw=="
|
||||
},
|
||||
"@fortawesome/fontawesome-common-types": {
|
||||
"version": "0.2.22",
|
||||
"resolved": "https://registry.npmjs.org/@fortawesome/fontawesome-common-types/-/fontawesome-common-types-0.2.22.tgz",
|
||||
"integrity": "sha512-QmEuZsipX5/cR9JOg0fsTN4Yr/9lieYWM8AQpmRa0eIfeOcl/HLYoEa366BCGRSrgNJEexuvOgbq9jnJ22IY5g=="
|
||||
"version": "0.2.25",
|
||||
"resolved": "https://registry.npmjs.org/@fortawesome/fontawesome-common-types/-/fontawesome-common-types-0.2.25.tgz",
|
||||
"integrity": "sha512-3RuZPDuuPELd7RXtUqTCfed14fcny9UiPOkdr2i+cYxBoTOfQgxcDoq77fHiiHcgWuo1LoBUpvGxFF1H/y7s3Q=="
|
||||
},
|
||||
"@fortawesome/free-brands-svg-icons": {
|
||||
"version": "5.10.2",
|
||||
"resolved": "https://registry.npmjs.org/@fortawesome/free-brands-svg-icons/-/free-brands-svg-icons-5.10.2.tgz",
|
||||
"integrity": "sha512-r5Dxr2h8f9bEI7F/gj/2v1OX9S6DMif9ZKR2VFQCSXHwahojLlOWnFILYsrjhzOISESkh6WDL9IOdkdbKM7KPw==",
|
||||
"version": "5.11.2",
|
||||
"resolved": "https://registry.npmjs.org/@fortawesome/free-brands-svg-icons/-/free-brands-svg-icons-5.11.2.tgz",
|
||||
"integrity": "sha512-wKK5znpHiZ2S0VgOvbeAnYuzkk3H86rxWajD9PVpfBj3s/kySEWTFKh/uLPyxiTOx8Tsd0OGN4En/s9XudVHLQ==",
|
||||
"requires": {
|
||||
"@fortawesome/fontawesome-common-types": "^0.2.22"
|
||||
"@fortawesome/fontawesome-common-types": "^0.2.25"
|
||||
}
|
||||
},
|
||||
"@jest/console": {
|
||||
@@ -1787,51 +1787,43 @@
|
||||
"dev": true
|
||||
},
|
||||
"@patternfly/patternfly": {
|
||||
"version": "2.27.0",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-2.27.0.tgz",
|
||||
"integrity": "sha512-sYSKUG3PL1KNKVw6bhijur0fS2pgfxWFmLCedxXaECt4KdKcg6rGvInzQnyGiQhWMVZBbWxFCWvBxBIr7L8ilA=="
|
||||
"version": "2.40.2",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-2.40.2.tgz",
|
||||
"integrity": "sha512-KCPQ6EL39xJen/B67MGv56i3h6bU5l7FD6f5IYU30z+ed2gM8zAYI3mPKNV05TMJv6+EQfp6O7dqCM3PJ8Q1yw=="
|
||||
},
|
||||
"@patternfly/react-core": {
|
||||
"version": "3.87.3",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-3.87.3.tgz",
|
||||
"integrity": "sha512-5kcuOIucqtnmGKjV13gRHEcU31AbNXNiNX3yhEhdeuG4gH00gFyUAxim3fkbYnR4OtGdU2MLvVOjoMfYj62rBQ==",
|
||||
"version": "3.120.2",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-3.120.2.tgz",
|
||||
"integrity": "sha512-PgV5w+3NlXK7hKvu0YY1pjXgd56dLwbIWE4m72JstxJIp/vpRShB6bfiSYNQGVi2ZQUudQTSH5sVWaBqXUaquw==",
|
||||
"requires": {
|
||||
"@patternfly/react-icons": "^3.10.17",
|
||||
"@patternfly/react-styles": "^3.5.13",
|
||||
"@patternfly/react-tokens": "^2.6.16",
|
||||
"@patternfly/react-icons": "^3.14.15",
|
||||
"@patternfly/react-styles": "^3.6.2",
|
||||
"@patternfly/react-tokens": "^2.7.2",
|
||||
"emotion": "^9.2.9",
|
||||
"exenv": "^1.2.2",
|
||||
"focus-trap-react": "^4.0.1",
|
||||
"tippy.js": "3.4.1"
|
||||
},
|
||||
"dependencies": {
|
||||
"@patternfly/react-icons": {
|
||||
"version": "3.12.0",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.12.0.tgz",
|
||||
"integrity": "sha512-45nP1m4La/LusrKVQNwkyZV3mqAJWhYgMce2+0VewAJ5ts4ygUJYzXR1vZQk09E8gTBCHMEdq0Af1xywwluHFg==",
|
||||
"requires": {
|
||||
"@fortawesome/free-brands-svg-icons": "^5.8.1"
|
||||
}
|
||||
},
|
||||
"@patternfly/react-tokens": {
|
||||
"version": "2.6.16",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.6.16.tgz",
|
||||
"integrity": "sha512-dhr4ne4thSmSKBr4anV07KSzUXEs6KpCMDxxNiwrgFdZwNHtyNcaPc+F9pQZ5A0n4qYmMLpCrprb7m5o/83riQ=="
|
||||
"version": "2.7.2",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.7.2.tgz",
|
||||
"integrity": "sha512-3QslQUErDLXGTzp2iGQNJD1UjZ+1NqwavOlsbxACUZ6LjXyJ7Y4TZbxDQrpgzPsD1SFPEVWufzpdjjtRBZ/b7g=="
|
||||
}
|
||||
}
|
||||
},
|
||||
"@patternfly/react-icons": {
|
||||
"version": "3.12.0",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.12.0.tgz",
|
||||
"integrity": "sha512-45nP1m4La/LusrKVQNwkyZV3mqAJWhYgMce2+0VewAJ5ts4ygUJYzXR1vZQk09E8gTBCHMEdq0Af1xywwluHFg==",
|
||||
"version": "3.14.15",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.14.15.tgz",
|
||||
"integrity": "sha512-7mIr1nzAXu6CdxKnhJGggIghx3DCaFXv6an+mfP/IwWifsLhcpE1c0iYkmVkvlI9X4cQAzeg9VfEGR7quhPOlA==",
|
||||
"requires": {
|
||||
"@fortawesome/free-brands-svg-icons": "^5.8.1"
|
||||
}
|
||||
},
|
||||
"@patternfly/react-styles": {
|
||||
"version": "3.5.13",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-3.5.13.tgz",
|
||||
"integrity": "sha512-aiyOp/n4cMxWhNmokG9EAFt06YmWDi3EdGfa5gyjYRwABGLUhyHo2r7kBqT3xxw0bLcOYDTPU94SaH63uAaRag==",
|
||||
"version": "3.6.2",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-3.6.2.tgz",
|
||||
"integrity": "sha512-WRXPC1R/qL+i/ANnrA0nEe6CcLHLZJIKWzSJ4gS2h9VdHvKySEdIlk9EtAZ0dNkv3whANjaKlR/n2/uFuXlzyw==",
|
||||
"requires": {
|
||||
"@babel/helper-plugin-utils": "^7.0.0-beta.48",
|
||||
"camel-case": "^3.0.0",
|
||||
@@ -1849,17 +1841,24 @@
|
||||
},
|
||||
"dependencies": {
|
||||
"acorn": {
|
||||
"version": "6.3.0",
|
||||
"resolved": "https://registry.npmjs.org/acorn/-/acorn-6.3.0.tgz",
|
||||
"integrity": "sha512-/czfa8BwS88b9gWQVhc8eknunSA2DoJpJyTQkhheIf5E48u1N0R4q/YxxsAeqRrmK9TQ/uYfgLDfZo91UlANIA=="
|
||||
"version": "7.1.0",
|
||||
"resolved": "https://registry.npmjs.org/acorn/-/acorn-7.1.0.tgz",
|
||||
"integrity": "sha512-kL5CuoXA/dgxlBbVrflsflzQ3PAas7RYZB52NOm/6839iVYJgKMJ3cQJD+t2i5+qFa8h3MDpEOJiS64E8JLnSQ=="
|
||||
},
|
||||
"acorn-globals": {
|
||||
"version": "4.3.3",
|
||||
"resolved": "https://registry.npmjs.org/acorn-globals/-/acorn-globals-4.3.3.tgz",
|
||||
"integrity": "sha512-vkR40VwS2SYO98AIeFvzWWh+xyc2qi9s7OoXSFEGIP/rOJKzjnhykaZJNnHdoq4BL2gGxI5EZOU16z896EYnOQ==",
|
||||
"version": "4.3.4",
|
||||
"resolved": "https://registry.npmjs.org/acorn-globals/-/acorn-globals-4.3.4.tgz",
|
||||
"integrity": "sha512-clfQEh21R+D0leSbUdWf3OcfqyaCSAQ8Ryq00bofSekfr9W8u1jyYZo6ir0xu9Gtcf7BjcHJpnbZH7JOCpP60A==",
|
||||
"requires": {
|
||||
"acorn": "^6.0.1",
|
||||
"acorn-walk": "^6.0.1"
|
||||
},
|
||||
"dependencies": {
|
||||
"acorn": {
|
||||
"version": "6.3.0",
|
||||
"resolved": "https://registry.npmjs.org/acorn/-/acorn-6.3.0.tgz",
|
||||
"integrity": "sha512-/czfa8BwS88b9gWQVhc8eknunSA2DoJpJyTQkhheIf5E48u1N0R4q/YxxsAeqRrmK9TQ/uYfgLDfZo91UlANIA=="
|
||||
}
|
||||
}
|
||||
},
|
||||
"escodegen": {
|
||||
@@ -1880,16 +1879,16 @@
|
||||
"integrity": "sha1-/cpRzuYTOJXjyI1TXOSdv/YqRjM="
|
||||
},
|
||||
"jsdom": {
|
||||
"version": "15.1.1",
|
||||
"resolved": "https://registry.npmjs.org/jsdom/-/jsdom-15.1.1.tgz",
|
||||
"integrity": "sha512-cQZRBB33arrDAeCrAEWn1U3SvrvC8XysBua9Oqg1yWrsY/gYcusloJC3RZJXuY5eehSCmws8f2YeliCqGSkrtQ==",
|
||||
"version": "15.2.0",
|
||||
"resolved": "https://registry.npmjs.org/jsdom/-/jsdom-15.2.0.tgz",
|
||||
"integrity": "sha512-+hRyEfjRPFwTYMmSQ3/f7U9nP8ZNZmbkmUek760ZpxnCPWJIhaaLRuUSvpJ36fZKCGENxLwxClzwpOpnXNfChQ==",
|
||||
"requires": {
|
||||
"abab": "^2.0.0",
|
||||
"acorn": "^6.1.1",
|
||||
"acorn": "^7.1.0",
|
||||
"acorn-globals": "^4.3.2",
|
||||
"array-equal": "^1.0.0",
|
||||
"cssom": "^0.3.6",
|
||||
"cssstyle": "^1.2.2",
|
||||
"cssom": "^0.4.1",
|
||||
"cssstyle": "^2.0.0",
|
||||
"data-urls": "^1.1.0",
|
||||
"domexception": "^1.0.1",
|
||||
"escodegen": "^1.11.1",
|
||||
@@ -1913,16 +1912,23 @@
|
||||
},
|
||||
"dependencies": {
|
||||
"cssom": {
|
||||
"version": "0.3.8",
|
||||
"resolved": "https://registry.npmjs.org/cssom/-/cssom-0.3.8.tgz",
|
||||
"integrity": "sha512-b0tGHbfegbhPJpxpiBPU2sCkigAqtM9O121le6bbOlgyV+NyGyCmVfJ6QW9eRjz8CpNfWEOYBIMIGRYkLwsIYg=="
|
||||
"version": "0.4.1",
|
||||
"resolved": "https://registry.npmjs.org/cssom/-/cssom-0.4.1.tgz",
|
||||
"integrity": "sha512-6Aajq0XmukE7HdXUU6IoSWuH1H6gH9z6qmagsstTiN7cW2FNTsb+J2Chs+ufPgZCsV/yo8oaEudQLrb9dGxSVQ=="
|
||||
},
|
||||
"cssstyle": {
|
||||
"version": "1.4.0",
|
||||
"resolved": "https://registry.npmjs.org/cssstyle/-/cssstyle-1.4.0.tgz",
|
||||
"integrity": "sha512-GBrLZYZ4X4x6/QEoBnIrqb8B/f5l4+8me2dkom/j1Gtbxy0kBv6OGzKuAsGM75bkGwGAFkt56Iwg28S3XTZgSA==",
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/cssstyle/-/cssstyle-2.0.0.tgz",
|
||||
"integrity": "sha512-QXSAu2WBsSRXCPjvI43Y40m6fMevvyRm8JVAuF9ksQz5jha4pWP1wpaK7Yu5oLFc6+XAY+hj8YhefyXcBB53gg==",
|
||||
"requires": {
|
||||
"cssom": "0.3.x"
|
||||
"cssom": "~0.3.6"
|
||||
},
|
||||
"dependencies": {
|
||||
"cssom": {
|
||||
"version": "0.3.8",
|
||||
"resolved": "https://registry.npmjs.org/cssom/-/cssom-0.3.8.tgz",
|
||||
"integrity": "sha512-b0tGHbfegbhPJpxpiBPU2sCkigAqtM9O121le6bbOlgyV+NyGyCmVfJ6QW9eRjz8CpNfWEOYBIMIGRYkLwsIYg=="
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1987,9 +1993,9 @@
|
||||
"integrity": "sha512-M4yMwr6mAnQz76TbJm914+gPpB/nCwvZbJU28cUD6dR004SAxDLOOSUaB1JDRqLtaOV/vi0IC5lEAGFgrjGv/g=="
|
||||
},
|
||||
"whatwg-url": {
|
||||
"version": "7.0.0",
|
||||
"resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-7.0.0.tgz",
|
||||
"integrity": "sha512-37GeVSIJ3kn1JgKyjiYNmSLP1yzbpb29jdmwBSgkD9h40/hyrR/OifpVUndji3tmwGgD8qpw7iQu3RSbCrBpsQ==",
|
||||
"version": "7.1.0",
|
||||
"resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-7.1.0.tgz",
|
||||
"integrity": "sha512-WUu7Rg1DroM7oQvGWfOiAK21n74Gg+T4elXEQYkOhtyLeWiJFoOGLXPKI/9gzIie9CtwVLm8wtw6YJdKyxSjeg==",
|
||||
"requires": {
|
||||
"lodash.sortby": "^4.7.0",
|
||||
"tr46": "^1.0.1",
|
||||
@@ -1997,9 +2003,9 @@
|
||||
}
|
||||
},
|
||||
"ws": {
|
||||
"version": "7.1.2",
|
||||
"resolved": "https://registry.npmjs.org/ws/-/ws-7.1.2.tgz",
|
||||
"integrity": "sha512-gftXq3XI81cJCgkUiAVixA0raD9IVmXqsylCrjRygw4+UOOGzPoxnQ6r/CnVL9i+mDncJo94tSkyrtuuQVBmrg==",
|
||||
"version": "7.2.0",
|
||||
"resolved": "https://registry.npmjs.org/ws/-/ws-7.2.0.tgz",
|
||||
"integrity": "sha512-+SqNqFbwTm/0DC18KYzIsMTnEWpLwJsiasW/O17la4iDRRIO9uaHbvKiAS3AHgTiuuWerK/brj4O6MYZkei9xg==",
|
||||
"requires": {
|
||||
"async-limiter": "^1.0.0"
|
||||
}
|
||||
@@ -2007,9 +2013,9 @@
|
||||
}
|
||||
},
|
||||
"@patternfly/react-tokens": {
|
||||
"version": "2.6.16",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.6.16.tgz",
|
||||
"integrity": "sha512-dhr4ne4thSmSKBr4anV07KSzUXEs6KpCMDxxNiwrgFdZwNHtyNcaPc+F9pQZ5A0n4qYmMLpCrprb7m5o/83riQ=="
|
||||
"version": "2.6.31",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.6.31.tgz",
|
||||
"integrity": "sha512-K9semfLIdf2vECefAbheXPVwZqq8nXY0Hf/VkWh6OBCL6R4FekxajpSBgobeoTQUotmvz5boMngqhkUjE7yChA=="
|
||||
},
|
||||
"@types/babel__core": {
|
||||
"version": "7.1.1",
|
||||
@@ -5308,9 +5314,9 @@
|
||||
}
|
||||
},
|
||||
"csstype": {
|
||||
"version": "2.6.6",
|
||||
"resolved": "https://registry.npmjs.org/csstype/-/csstype-2.6.6.tgz",
|
||||
"integrity": "sha512-RpFbQGUE74iyPgvr46U9t1xoQBM8T4BL8SxrN66Le2xYAPSaDJJKeztV3awugusb3g3G9iL8StmkBBXhcbbXhg=="
|
||||
"version": "2.6.7",
|
||||
"resolved": "https://registry.npmjs.org/csstype/-/csstype-2.6.7.tgz",
|
||||
"integrity": "sha512-9Mcn9sFbGBAdmimWb2gLVDtFJzeKtDGIr76TUqmjZrw9LFXBMSU70lcs+C0/7fyCd6iBDqmksUcCOUIkisPHsQ=="
|
||||
},
|
||||
"currently-unhandled": {
|
||||
"version": "0.4.1",
|
||||
@@ -12959,9 +12965,9 @@
|
||||
"dev": true
|
||||
},
|
||||
"popper.js": {
|
||||
"version": "1.15.0",
|
||||
"resolved": "https://registry.npmjs.org/popper.js/-/popper.js-1.15.0.tgz",
|
||||
"integrity": "sha512-w010cY1oCUmI+9KwwlWki+r5jxKfTFDVoadl7MSrIujHU5MJ5OR6HTDj6Xo8aoR/QsA56x8jKjA59qGH4ELtrA=="
|
||||
"version": "1.16.0",
|
||||
"resolved": "https://registry.npmjs.org/popper.js/-/popper.js-1.16.0.tgz",
|
||||
"integrity": "sha512-+G+EkOPoE5S/zChTpmBSSDYmhXJ5PsW8eMhH8cP/CQHMFPBG/kC9Y5IIw6qNYgdJ+/COf0ddY2li28iHaZRSjw=="
|
||||
},
|
||||
"portfinder": {
|
||||
"version": "1.0.20",
|
||||
@@ -17421,9 +17427,9 @@
|
||||
"integrity": "sha512-A5CUptxDsvxKJEU3yO6DuWBSJz/qizqzJKOMIfUJHETbBw/sFaDxgd6fxm1ewUaM0jZ444Fc5vC5ROYurg/4Pw=="
|
||||
},
|
||||
"xmlchars": {
|
||||
"version": "2.1.1",
|
||||
"resolved": "https://registry.npmjs.org/xmlchars/-/xmlchars-2.1.1.tgz",
|
||||
"integrity": "sha512-7hew1RPJ1iIuje/Y01bGD/mXokXxegAgVS+e+E0wSi2ILHQkYAH1+JXARwTjZSM4Z4Z+c73aKspEcqj+zPPL/w=="
|
||||
"version": "2.2.0",
|
||||
"resolved": "https://registry.npmjs.org/xmlchars/-/xmlchars-2.2.0.tgz",
|
||||
"integrity": "sha512-JZnDKK8B0RCDw84FNdDAIpZK+JuJw+s7Lz8nksI7SIuU3UXJJslUthsi+uWBUYOwPFwW7W7PRLRfUKpxjtjFCw=="
|
||||
},
|
||||
"xregexp": {
|
||||
"version": "4.0.0",
|
||||
|
||||
@@ -57,10 +57,10 @@
|
||||
},
|
||||
"dependencies": {
|
||||
"@lingui/react": "^2.7.2",
|
||||
"@patternfly/patternfly": "^2.27.0",
|
||||
"@patternfly/react-core": "^3.87.3",
|
||||
"@patternfly/react-icons": "^3.12.0",
|
||||
"@patternfly/react-tokens": "^2.6.16",
|
||||
"@patternfly/patternfly": "^2.40.2",
|
||||
"@patternfly/react-core": "^3.120.2",
|
||||
"@patternfly/react-icons": "^3.14.15",
|
||||
"@patternfly/react-tokens": "^2.6.31",
|
||||
"ansi-to-html": "^0.6.11",
|
||||
"axios": "^0.18.1",
|
||||
"codemirror": "^5.47.0",
|
||||
|
||||
@@ -1,5 +1,7 @@
|
||||
import AdHocCommands from './models/AdHocCommands';
|
||||
import Config from './models/Config';
|
||||
import CredentialTypes from './models/CredentialTypes';
|
||||
import Credentials from './models/Credentials';
|
||||
import InstanceGroups from './models/InstanceGroups';
|
||||
import Inventories from './models/Inventories';
|
||||
import InventorySources from './models/InventorySources';
|
||||
@@ -23,6 +25,8 @@ import WorkflowJobTemplates from './models/WorkflowJobTemplates';
|
||||
|
||||
const AdHocCommandsAPI = new AdHocCommands();
|
||||
const ConfigAPI = new Config();
|
||||
const CredentialsAPI = new Credentials();
|
||||
const CredentialTypesAPI = new CredentialTypes();
|
||||
const InstanceGroupsAPI = new InstanceGroups();
|
||||
const InventoriesAPI = new Inventories();
|
||||
const InventorySourcesAPI = new InventorySources();
|
||||
@@ -47,6 +51,8 @@ const WorkflowJobTemplatesAPI = new WorkflowJobTemplates();
|
||||
export {
|
||||
AdHocCommandsAPI,
|
||||
ConfigAPI,
|
||||
CredentialsAPI,
|
||||
CredentialTypesAPI,
|
||||
InstanceGroupsAPI,
|
||||
InventoriesAPI,
|
||||
InventorySourcesAPI,
|
||||
|
||||
10
awx/ui_next/src/api/models/CredentialTypes.js
Normal file
@@ -0,0 +1,10 @@
import Base from '../Base';

class CredentialTypes extends Base {
  constructor(http) {
    super(http);
    this.baseUrl = '/api/v2/credential_types/';
  }
}

export default CredentialTypes;
10
awx/ui_next/src/api/models/Credentials.js
Normal file
@@ -0,0 +1,10 @@
import Base from '../Base';

class Credentials extends Base {
  constructor(http) {
    super(http);
    this.baseUrl = '/api/v2/credentials/';
  }
}

export default Credentials;
@@ -4,6 +4,12 @@ class Inventories extends Base {
|
||||
constructor(http) {
|
||||
super(http);
|
||||
this.baseUrl = '/api/v2/inventories/';
|
||||
|
||||
this.readAccessList = this.readAccessList.bind(this);
|
||||
}
|
||||
|
||||
readAccessList(id, params) {
|
||||
return this.http.get(`${this.baseUrl}${id}/access_list/`, { params });
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
@@ -12,6 +12,7 @@ class JobTemplates extends InstanceGroupsMixin(NotificationsMixin(Base)) {
|
||||
this.associateLabel = this.associateLabel.bind(this);
|
||||
this.disassociateLabel = this.disassociateLabel.bind(this);
|
||||
this.readCredentials = this.readCredentials.bind(this);
|
||||
this.readAccessList = this.readAccessList.bind(this);
|
||||
this.generateLabel = this.generateLabel.bind(this);
|
||||
}
|
||||
|
||||
@@ -44,6 +45,23 @@ class JobTemplates extends InstanceGroupsMixin(NotificationsMixin(Base)) {
|
||||
readCredentials(id, params) {
|
||||
return this.http.get(`${this.baseUrl}${id}/credentials/`, { params });
|
||||
}
|
||||
|
||||
associateCredentials(id, credentialId) {
|
||||
return this.http.post(`${this.baseUrl}${id}/credentials/`, {
|
||||
id: credentialId,
|
||||
});
|
||||
}
|
||||
|
||||
disassociateCredentials(id, credentialId) {
|
||||
return this.http.post(`${this.baseUrl}${id}/credentials/`, {
|
||||
id: credentialId,
|
||||
disassociate: true,
|
||||
});
|
||||
}
|
||||
|
||||
readAccessList(id, params) {
|
||||
return this.http.get(`${this.baseUrl}${id}/access_list/`, { params });
|
||||
}
|
||||
}
|
||||
|
||||
export default JobTemplates;
|
||||
|
||||
@@ -1,16 +1,22 @@
|
||||
import Base from '../Base';
|
||||
import NotificationsMixin from '../mixins/Notifications.mixin';
|
||||
import LaunchUpdateMixin from '../mixins/LaunchUpdate.mixin';
|
||||
|
||||
class Projects extends LaunchUpdateMixin(Base) {
|
||||
class Projects extends LaunchUpdateMixin(NotificationsMixin(Base)) {
|
||||
constructor(http) {
|
||||
super(http);
|
||||
this.baseUrl = '/api/v2/projects/';
|
||||
|
||||
this.readAccessList = this.readAccessList.bind(this);
|
||||
this.readPlaybooks = this.readPlaybooks.bind(this);
|
||||
this.readSync = this.readSync.bind(this);
|
||||
this.sync = this.sync.bind(this);
|
||||
}
|
||||
|
||||
readAccessList(id, params) {
|
||||
return this.http.get(`${this.baseUrl}${id}/access_list/`, { params });
|
||||
}
|
||||
|
||||
readPlaybooks(id) {
|
||||
return this.http.get(`${this.baseUrl}${id}/playbooks/`);
|
||||
}
|
||||
|
||||
@@ -0,0 +1,10 @@
import DataListCell from '@components/DataListCell';
import styled from 'styled-components';

const ActionButtonCell = styled(DataListCell)`
  & > :not(:first-child) {
    margin-left: 20px;
  }
`;
ActionButtonCell.displayName = 'ActionButtonCell';
export default ActionButtonCell;
@@ -0,0 +1,10 @@
import React from 'react';
import { mount } from 'enzyme';
import ActionButtonCell from './ActionButtonCell';

describe('ActionButtonCell', () => {
  test('renders the expected content', () => {
    const wrapper = mount(<ActionButtonCell />);
    expect(wrapper).toHaveLength(1);
  });
});
1
awx/ui_next/src/components/ActionButtonCell/index.js
Normal file
@@ -0,0 +1 @@
export { default } from './ActionButtonCell';
@@ -0,0 +1,92 @@
|
||||
import React from 'react';
|
||||
import PropTypes from 'prop-types';
|
||||
import { Button, Tooltip } from '@patternfly/react-core';
|
||||
import { CopyIcon } from '@patternfly/react-icons';
|
||||
|
||||
import styled from 'styled-components';
|
||||
|
||||
const CopyButton = styled(Button)`
|
||||
padding: 2px 4px;
|
||||
margin-left: 8px;
|
||||
border: none;
|
||||
&:hover {
|
||||
background-color: #0066cc;
|
||||
color: white;
|
||||
}
|
||||
`;
|
||||
|
||||
export const clipboardCopyFunc = (event, text) => {
|
||||
const clipboard = event.currentTarget.parentElement;
|
||||
const el = document.createElement('input');
|
||||
el.value = text;
|
||||
clipboard.appendChild(el);
|
||||
el.select();
|
||||
document.execCommand('copy');
|
||||
clipboard.removeChild(el);
|
||||
};
|
||||
|
||||
class ClipboardCopyButton extends React.Component {
|
||||
constructor(props) {
|
||||
super(props);
|
||||
|
||||
this.state = {
|
||||
copied: false,
|
||||
};
|
||||
|
||||
this.handleCopyClick = this.handleCopyClick.bind(this);
|
||||
}
|
||||
|
||||
handleCopyClick = event => {
|
||||
const { stringToCopy, switchDelay } = this.props;
|
||||
if (this.timer) {
|
||||
window.clearTimeout(this.timer);
|
||||
this.setState({ copied: false });
|
||||
}
|
||||
clipboardCopyFunc(event, stringToCopy);
|
||||
this.setState({ copied: true }, () => {
|
||||
this.timer = window.setTimeout(() => {
|
||||
this.setState({ copied: false });
|
||||
this.timer = null;
|
||||
}, switchDelay);
|
||||
});
|
||||
};
|
||||
|
||||
render() {
|
||||
const { clickTip, entryDelay, exitDelay, hoverTip } = this.props;
|
||||
const { copied } = this.state;
|
||||
|
||||
return (
|
||||
<Tooltip
|
||||
entryDelay={entryDelay}
|
||||
exitDelay={exitDelay}
|
||||
trigger="mouseenter focus click"
|
||||
content={copied ? clickTip : hoverTip}
|
||||
>
|
||||
<CopyButton
|
||||
variant="plain"
|
||||
onClick={this.handleCopyClick}
|
||||
aria-label={hoverTip}
|
||||
>
|
||||
<CopyIcon />
|
||||
</CopyButton>
|
||||
</Tooltip>
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
ClipboardCopyButton.propTypes = {
|
||||
clickTip: PropTypes.string.isRequired,
|
||||
entryDelay: PropTypes.number,
|
||||
exitDelay: PropTypes.number,
|
||||
hoverTip: PropTypes.string.isRequired,
|
||||
stringToCopy: PropTypes.string.isRequired,
|
||||
switchDelay: PropTypes.number,
|
||||
};
|
||||
|
||||
ClipboardCopyButton.defaultProps = {
|
||||
entryDelay: 100,
|
||||
exitDelay: 1600,
|
||||
switchDelay: 2000,
|
||||
};
|
||||
|
||||
export default ClipboardCopyButton;
|
||||
@@ -0,0 +1,36 @@
|
||||
import React from 'react';
|
||||
import { mountWithContexts } from '@testUtils/enzymeHelpers';
|
||||
import ClipboardCopyButton from './ClipboardCopyButton';
|
||||
|
||||
document.execCommand = jest.fn();
|
||||
|
||||
jest.useFakeTimers();
|
||||
|
||||
describe('ClipboardCopyButton', () => {
|
||||
test('renders the expected content', () => {
|
||||
const wrapper = mountWithContexts(
|
||||
<ClipboardCopyButton
|
||||
clickTip="foo"
|
||||
hoverTip="bar"
|
||||
stringToCopy="foobar!"
|
||||
/>
|
||||
);
|
||||
expect(wrapper).toHaveLength(1);
|
||||
});
|
||||
test('clicking button calls execCommand to copy to clipboard', () => {
|
||||
const wrapper = mountWithContexts(
|
||||
<ClipboardCopyButton
|
||||
clickTip="foo"
|
||||
hoverTip="bar"
|
||||
stringToCopy="foobar!"
|
||||
/>
|
||||
).find('ClipboardCopyButton');
|
||||
expect(wrapper.state('copied')).toBe(false);
|
||||
wrapper.find('Button').simulate('click');
|
||||
expect(document.execCommand).toBeCalledWith('copy');
|
||||
expect(wrapper.state('copied')).toBe(true);
|
||||
jest.runAllTimers();
|
||||
wrapper.update();
|
||||
expect(wrapper.state('copied')).toBe(false);
|
||||
});
|
||||
});
|
||||
1
awx/ui_next/src/components/ClipboardCopyButton/index.js
Normal file
@@ -0,0 +1 @@
export { default } from './ClipboardCopyButton';
@@ -6,6 +6,9 @@ import { AngleRightIcon } from '@patternfly/react-icons';
|
||||
import omitProps from '@util/omitProps';
|
||||
import ExpandingContainer from './ExpandingContainer';
|
||||
|
||||
// Make button findable by tests
|
||||
Button.displayName = 'Button';
|
||||
|
||||
const Toggle = styled.div`
|
||||
display: flex;
|
||||
|
||||
|
||||
@@ -103,9 +103,10 @@ describe('<DataListToolbar />', () => {
|
||||
let searchDropdownItems = toolbar.find(searchDropdownMenuItems).children();
|
||||
expect(searchDropdownItems.length).toBe(1);
|
||||
const mockedSortEvent = { target: { innerText: 'Bar' } };
|
||||
sortDropdownItems.at(0).simulate('click', mockedSortEvent);
|
||||
searchDropdownItems.at(0).simulate('click', mockedSortEvent);
|
||||
toolbar = mountWithContexts(
|
||||
<DataListToolbar
|
||||
qsConfig={QS_CONFIG}
|
||||
sortedColumnKey="foo"
|
||||
sortOrder="descending"
|
||||
columns={multipleColumns}
|
||||
@@ -170,6 +171,7 @@ describe('<DataListToolbar />', () => {
|
||||
|
||||
toolbar = mountWithContexts(
|
||||
<DataListToolbar
|
||||
qsConfig={QS_CONFIG}
|
||||
sortedColumnKey="id"
|
||||
sortOrder="ascending"
|
||||
columns={numericColumns}
|
||||
@@ -236,6 +238,7 @@ describe('<DataListToolbar />', () => {
|
||||
|
||||
toolbar = mountWithContexts(
|
||||
<DataListToolbar
|
||||
qsConfig={QS_CONFIG}
|
||||
isAllSelected
|
||||
showExpandCollapse
|
||||
sortedColumnKey="name"
|
||||
|
||||
@@ -10,7 +10,16 @@ const QuestionCircleIcon = styled(PFQuestionCircleIcon)`
|
||||
`;
|
||||
|
||||
function FormField(props) {
|
||||
const { id, name, label, tooltip, validate, isRequired, ...rest } = props;
|
||||
const {
|
||||
id,
|
||||
name,
|
||||
label,
|
||||
tooltip,
|
||||
tooltipMaxWidth,
|
||||
validate,
|
||||
isRequired,
|
||||
...rest
|
||||
} = props;
|
||||
|
||||
return (
|
||||
<Field
|
||||
@@ -29,7 +38,11 @@ function FormField(props) {
|
||||
label={label}
|
||||
>
|
||||
{tooltip && (
|
||||
<Tooltip position="right" content={tooltip}>
|
||||
<Tooltip
|
||||
content={tooltip}
|
||||
maxWidth={tooltipMaxWidth}
|
||||
position="right"
|
||||
>
|
||||
<QuestionCircleIcon />
|
||||
</Tooltip>
|
||||
)}
|
||||
@@ -58,6 +71,7 @@ FormField.propTypes = {
|
||||
validate: PropTypes.func,
|
||||
isRequired: PropTypes.bool,
|
||||
tooltip: PropTypes.node,
|
||||
tooltipMaxWidth: PropTypes.string,
|
||||
};
|
||||
|
||||
FormField.defaultProps = {
|
||||
@@ -65,6 +79,7 @@ FormField.defaultProps = {
|
||||
validate: () => {},
|
||||
isRequired: false,
|
||||
tooltip: null,
|
||||
tooltipMaxWidth: '',
|
||||
};
|
||||
|
||||
export default FormField;
|
||||
|
||||
@@ -6,6 +6,6 @@ const Row = styled.div`
|
||||
grid-gap: 20px;
|
||||
grid-template-columns: repeat(auto-fit, minmax(300px, 1fr));
|
||||
`;
|
||||
export default function FormRow({ children }) {
|
||||
return <Row>{children}</Row>;
|
||||
export default function FormRow({ children, className }) {
|
||||
return <Row className={className}>{children}</Row>;
|
||||
}
|
||||
|
||||
68
awx/ui_next/src/components/Lookup/CredentialLookup.jsx
Normal file
@@ -0,0 +1,68 @@
+import React from 'react';
+import { withI18n } from '@lingui/react';
+import { bool, func, number, string, oneOfType } from 'prop-types';
+import { CredentialsAPI } from '@api';
+import { Credential } from '@types';
+import { mergeParams } from '@util/qs';
+import { FormGroup } from '@patternfly/react-core';
+import Lookup from '@components/Lookup';
+
+function CredentialLookup({
+  helperTextInvalid,
+  label,
+  isValid,
+  onBlur,
+  onChange,
+  required,
+  credentialTypeId,
+  value,
+}) {
+  const getCredentials = async params =>
+    CredentialsAPI.read(
+      mergeParams(params, { credential_type: credentialTypeId })
+    );
+
+  return (
+    <FormGroup
+      fieldId="credential"
+      isRequired={required}
+      isValid={isValid}
+      label={label}
+      helperTextInvalid={helperTextInvalid}
+    >
+      <Lookup
+        id="credential"
+        lookupHeader={label}
+        name="credential"
+        value={value}
+        onBlur={onBlur}
+        onLookupSave={onChange}
+        getItems={getCredentials}
+        required={required}
+        sortedColumnKey="name"
+      />
+    </FormGroup>
+  );
+}
+
+CredentialLookup.propTypes = {
+  credentialTypeId: oneOfType([number, string]).isRequired,
+  helperTextInvalid: string,
+  isValid: bool,
+  label: string.isRequired,
+  onBlur: func,
+  onChange: func.isRequired,
+  required: bool,
+  value: Credential,
+};
+
+CredentialLookup.defaultProps = {
+  helperTextInvalid: '',
+  isValid: true,
+  onBlur: () => {},
+  required: false,
+  value: null,
+};
+
+export { CredentialLookup as _CredentialLookup };
+export default withI18n()(CredentialLookup);
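A minimal call site for the new component, using only the props declared above; the credential type id, label text, and Formik-style handlers are assumptions for illustration:

    <CredentialLookup
      credentialTypeId={scmCredentialTypeId}
      label="SCM Credential"
      value={formik.values.credential}
      onChange={credential => formik.setFieldValue('credential', credential)}
      required
    />

Because getCredentials closes over credentialTypeId, every page of results fetched inside the Lookup modal is filtered to that credential type via mergeParams.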
awx/ui_next/src/components/Lookup/CredentialLookup.test.jsx (new file, 41 lines)
@@ -0,0 +1,41 @@
+import React from 'react';
+import { mountWithContexts } from '@testUtils/enzymeHelpers';
+import CredentialLookup, { _CredentialLookup } from './CredentialLookup';
+import { CredentialsAPI } from '@api';
+
+jest.mock('@api');
+
+describe('CredentialLookup', () => {
+  let wrapper;
+
+  beforeEach(() => {
+    wrapper = mountWithContexts(
+      <CredentialLookup credentialTypeId={1} label="Foo" onChange={() => {}} />
+    );
+  });
+
+  afterEach(() => {
+    jest.clearAllMocks();
+  });
+
+  test('initially renders successfully', () => {
+    expect(wrapper.find('CredentialLookup')).toHaveLength(1);
+  });
+  test('should fetch credentials', () => {
+    expect(CredentialsAPI.read).toHaveBeenCalledTimes(1);
+    expect(CredentialsAPI.read).toHaveBeenCalledWith({
+      credential_type: 1,
+      order_by: 'name',
+      page: 1,
+      page_size: 5,
+    });
+  });
+  test('should display label', () => {
+    const title = wrapper.find('FormGroup .pf-c-form__label-text');
+    expect(title.text()).toEqual('Foo');
+  });
+  test('should define default value for function props', () => {
+    expect(_CredentialLookup.defaultProps.onBlur).toBeInstanceOf(Function);
+    expect(_CredentialLookup.defaultProps.onBlur).not.toThrow();
+  });
+});
@@ -15,16 +15,19 @@ import {
   ButtonVariant,
   InputGroup as PFInputGroup,
   Modal,
+  ToolbarItem,
 } from '@patternfly/react-core';
 import { withI18n } from '@lingui/react';
 import { t } from '@lingui/macro';
 import styled from 'styled-components';
 
+import AnsibleSelect from '../AnsibleSelect';
 import PaginatedDataList from '../PaginatedDataList';
+import VerticalSeperator from '../VerticalSeparator';
 import DataListToolbar from '../DataListToolbar';
 import CheckboxListItem from '../CheckboxListItem';
 import SelectedList from '../SelectedList';
-import { ChipGroup, Chip } from '../Chip';
+import { ChipGroup, Chip, CredentialChip } from '../Chip';
 import { getQSConfig, parseQueryString } from '../../util/qs';
 
 const SearchButton = styled(Button)`
@@ -83,14 +86,20 @@ class Lookup extends React.Component {
   }
 
   componentDidUpdate(prevProps) {
-    const { location } = this.props;
-    if (location !== prevProps.location) {
+    const { location, selectedCategory } = this.props;
+    if (
+      location !== prevProps.location ||
+      prevProps.selectedCategory !== selectedCategory
+    ) {
       this.getData();
     }
   }
 
   assertCorrectValueType() {
-    const { multiple, value } = this.props;
+    const { multiple, value, selectCategoryOptions } = this.props;
+    if (selectCategoryOptions) {
+      return;
+    }
     if (!multiple && Array.isArray(value)) {
       throw new Error(
         'Lookup value must not be an array unless `multiple` is set'
@@ -123,7 +132,13 @@ class Lookup extends React.Component {
   }
 
   toggleSelected(row) {
-    const { name, onLookupSave, multiple } = this.props;
+    const {
+      name,
+      onLookupSave,
+      multiple,
+      onToggleItem,
+      selectCategoryOptions,
+    } = this.props;
     const {
       lookupSelectedItems: updatedSelectedItems,
       isModalOpen,
@@ -132,8 +147,10 @@ class Lookup extends React.Component {
     const selectedIndex = updatedSelectedItems.findIndex(
       selectedRow => selectedRow.id === row.id
     );
-
     if (multiple) {
+      if (selectCategoryOptions) {
+        onToggleItem(row, isModalOpen);
+      }
       if (selectedIndex > -1) {
         updatedSelectedItems.splice(selectedIndex, 1);
         this.setState({ lookupSelectedItems: updatedSelectedItems });
@@ -156,7 +173,7 @@ class Lookup extends React.Component {
 
   handleModalToggle() {
     const { isModalOpen } = this.state;
-    const { value, multiple } = this.props;
+    const { value, multiple, selectCategory } = this.props;
     // Resets the selected items from parent state whenever modal is opened
     // This handles the case where the user closes/cancels the modal and
     // opens it again
@@ -168,6 +185,9 @@ class Lookup extends React.Component {
       this.setState({ lookupSelectedItems });
     } else {
      this.clearQSParams();
+      if (selectCategory) {
+        selectCategory(null, 'Machine');
+      }
     }
     this.setState(prevState => ({
       isModalOpen: !prevState.isModalOpen,
@@ -180,8 +200,9 @@ class Lookup extends React.Component {
     const value = multiple
       ? lookupSelectedItems
       : lookupSelectedItems[0] || null;
-    onLookupSave(value, name);
 
     this.handleModalToggle();
+    onLookupSave(value, name);
   }
 
   clearQSParams() {
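Taken together, the hunks above sketch a contract for the new callbacks: onToggleItem(row, isModalOpen) fires for each toggled row when selectCategoryOptions is provided, and selectCategory is used both as the AnsibleSelect onChange handler and as selectCategory(null, 'Machine') when the modal is dismissed, which reads as an (event, value) signature. A hedged sketch of what the parent-side handlers could look like; names and state shape are invented for illustration:

    // Hypothetical handlers owned by a category-aware parent component.
    handleCategoryChange = (event, value) => {
      // Lookup resets this to 'Machine' when the modal closes without saving.
      this.setState({ selectedCategory: { label: value, value } });
    };

    handleToggleItem = (row, isModalOpen) => {
      // With selectCategoryOptions set, the displayed selection is driven by
      // the parent-owned value, so toggle the row here; isModalOpen is
      // available if the parent wants to defer committing until save.
      this.setState(({ credentials }) => ({
        credentials: credentials.some(c => c.id === row.id)
          ? credentials.filter(c => c.id !== row.id)
          : credentials.concat(row),
      }));
    };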
@@ -201,6 +222,7 @@ class Lookup extends React.Component {
       count,
     } = this.state;
     const {
+      form,
       id,
       lookupHeader,
       value,
@@ -208,27 +230,40 @@ class Lookup extends React.Component {
       multiple,
       name,
       onBlur,
+      selectCategory,
       required,
       i18n,
+      selectCategoryOptions,
+      selectedCategory,
     } = this.props;
 
     const header = lookupHeader || i18n._(t`Items`);
     const canDelete = !required || (multiple && value.length > 1);
 
-    const chips = value ? (
-      <ChipGroup>
-        {(multiple ? value : [value]).map(chip => (
-          <Chip
-            key={chip.id}
-            onClick={() => this.toggleSelected(chip)}
-            isReadOnly={!canDelete}
-          >
-            {chip.name}
-          </Chip>
-        ))}
-      </ChipGroup>
-    ) : null;
-
+    const chips = () => {
+      return selectCategoryOptions && selectCategoryOptions.length > 0 ? (
+        <ChipGroup>
+          {(multiple ? value : [value]).map(chip => (
+            <CredentialChip
+              key={chip.id}
+              onClick={() => this.toggleSelected(chip)}
+              isReadOnly={!canDelete}
+              credential={chip}
+            />
+          ))}
+        </ChipGroup>
+      ) : (
+        <ChipGroup>
+          {(multiple ? value : [value]).map(chip => (
+            <Chip
+              key={chip.id}
+              onClick={() => this.toggleSelected(chip)}
+              isReadOnly={!canDelete}
+            >
+              {chip.name}
+            </Chip>
+          ))}
+        </ChipGroup>
+      );
+    };
     return (
       <Fragment>
         <InputGroup onBlur={onBlur}>
@@ -240,7 +275,9 @@ class Lookup extends React.Component {
           >
             <SearchIcon />
           </SearchButton>
-          <ChipHolder className="pf-c-form-control">{chips}</ChipHolder>
+          <ChipHolder className="pf-c-form-control">
+            {value ? chips(value) : null}
+          </ChipHolder>
         </InputGroup>
         <Modal
           className="awx-c-modal"
@@ -265,6 +302,21 @@ class Lookup extends React.Component {
           </Button>,
         ]}
       >
+        {selectCategoryOptions && selectCategoryOptions.length > 0 && (
+          <ToolbarItem css=" display: flex; align-items: center;">
+            <span css="flex: 0 0 25%;">Selected Category</span>
+            <VerticalSeperator />
+            <AnsibleSelect
+              css="flex: 1 1 75%;"
+              id="multiCredentialsLookUp-select"
+              label="Selected Category"
+              data={selectCategoryOptions}
+              value={selectedCategory.label}
+              onChange={selectCategory}
+              form={form}
+            />
+          </ToolbarItem>
+        )}
         <PaginatedDataList
           items={results}
           itemCount={count}
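The AnsibleSelect wiring above shows that selectedCategory carries both a label (fed to the select's value) and a value (compared against 'Vault' in the next hunk); the option entries handed to data presumably follow the same shape. A hypothetical option list, with the exact item fields assumed rather than taken from this diff:

    const credentialTypeOptions = [
      { label: 'Machine', value: 'Machine', key: 'machine', isDisabled: false },
      { label: 'Vault', value: 'Vault', key: 'vault', isDisabled: false },
    ];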
@@ -277,9 +329,18 @@ class Lookup extends React.Component {
               itemId={item.id}
               name={multiple ? item.name : name}
               label={item.name}
-              isSelected={lookupSelectedItems.some(i => i.id === item.id)}
+              isSelected={
+                selectCategoryOptions
+                  ? value.some(i => i.id === item.id)
+                  : lookupSelectedItems.some(i => i.id === item.id)
+              }
               onSelect={() => this.toggleSelected(item)}
-              isRadio={!multiple}
+              isRadio={
+                !multiple ||
+                (selectCategoryOptions &&
+                  selectCategoryOptions.length &&
+                  selectedCategory.value !== 'Vault')
+              }
             />
           )}
           renderToolbar={props => <DataListToolbar {...props} fillWidth />}
@@ -288,10 +349,13 @@ class Lookup extends React.Component {
         {lookupSelectedItems.length > 0 && (
           <SelectedList
             label={i18n._(t`Selected`)}
-            selected={lookupSelectedItems}
+            selected={selectCategoryOptions ? value : lookupSelectedItems}
             showOverflowAfter={5}
             onRemove={this.toggleSelected}
             isReadOnly={!canDelete}
+            isCredentialList={
+              selectCategoryOptions && selectCategoryOptions.length > 0
+            }
           />
         )}
         {error ? <div>error</div> : ''}
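Putting the new props together, a category-aware parent (a multi-credential field, say) might wire up the Lookup roughly as below; every concrete value is an assumption for illustration, and only props visible in this diff are used:

    <Lookup
      id="multiCredential"
      lookupHeader="Credentials"
      name="credentials"
      multiple
      value={this.state.credentials}
      onLookupSave={this.handleSave}
      onToggleItem={this.handleToggleItem}
      getItems={this.loadCredentials}
      selectCategoryOptions={credentialTypeOptions}
      selectCategory={this.handleCategoryChange}
      selectedCategory={this.state.selectedCategory}
      form={formik}
      sortedColumnKey="name"
    />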
Some files were not shown because too many files have changed in this diff.