Mirror of https://github.com/ansible/awx.git (synced 2026-03-21 10:57:36 -02:30)

Commit: merge and resolve conflicts
@@ -2,7 +2,8 @@
 This is a list of high-level changes for each release of AWX. A full list of commits can be found at `https://github.com/ansible/awx/releases/tag/<version>`.
 
-## 12.0.0 (TBD)
+## 12.0.0 (Jun 9, 2020)
 
+- Removed memcached as a dependency of AWX (https://github.com/ansible/awx/pull/7240)
 - Moved to a single container image build instead of separate awx_web and awx_task images. The container image is just `awx` (https://github.com/ansible/awx/pull/7228)
 - Official AWX container image builds now use a two-stage container build process that notably reduces the size of our published images (https://github.com/ansible/awx/pull/7017)
 - Removed support for HipChat notifications ([EoL announcement](https://www.atlassian.com/partnerships/slack/faq#faq-98b17ca3-247f-423b-9a78-70a91681eff0)); all previously-created HipChat notification templates will be deleted due to this removal.
@@ -11,6 +12,9 @@ This is a list of high-level changes for each release of AWX. A full list of com
 - Fixed a bug that caused the CyberArk AIM credential plugin to hang forever in some environments (https://github.com/ansible/awx/issues/6986)
 - Fixed a bug that caused ANY/ALL convergence settings not to save properly when editing approval nodes in the UI (https://github.com/ansible/awx/issues/6998)
 - Fixed a bug that broke support for the satellite6_group_prefix source variable (https://github.com/ansible/awx/issues/7031)
+- Fixed a bug that prevented changes to workflow node convergence settings when approval nodes were in use (https://github.com/ansible/awx/issues/7063)
+- Fixed a bug that caused notifications to fail on newer versions of Mattermost (https://github.com/ansible/awx/issues/7264)
+- Fixed a bug (by upgrading to 0.8.1 of the foreman collection) that prevented host_filters from working properly with Foreman-based inventory (https://github.com/ansible/awx/issues/7225)
 - Fixed a bug that prevented the usage of the Conjur credential plugin with secrets that contain spaces (https://github.com/ansible/awx/issues/7191)
 - Fixed a bug in `awx-manage run_wsbroadcast --status` in kubernetes (https://github.com/ansible/awx/pull/7009)
 - Fixed a bug that broke notification toggles for system jobs in the UI (https://github.com/ansible/awx/pull/7042)
@@ -157,8 +157,7 @@ If you start a second terminal session, you can take a look at the running conta
 
     $ docker ps
     CONTAINER ID   IMAGE                                              COMMAND                  CREATED          STATUS          PORTS                                                                                                                                                                NAMES
     44251b476f98   gcr.io/ansible-tower-engineering/awx_devel:devel   "/entrypoint.sh /bin…"   27 seconds ago   Up 23 seconds   0.0.0.0:6899->6899/tcp, 0.0.0.0:7899-7999->7899-7999/tcp, 0.0.0.0:8013->8013/tcp, 0.0.0.0:8043->8043/tcp, 0.0.0.0:8080->8080/tcp, 22/tcp, 0.0.0.0:8888->8888/tcp   tools_awx_run_9e820694d57e
-    b049a43817b4   memcached:alpine                                   "docker-entrypoint.s…"   28 seconds ago   Up 26 seconds   0.0.0.0:11211->11211/tcp                                                                                                                                             tools_memcached_1
     40de380e3c2e   redis:latest                                       "docker-entrypoint.s…"   28 seconds ago   Up 26 seconds   0.0.0.0:6379->6379/tcp                                                                                                                                               tools_redis_1
     b66a506d3007   postgres:10                                        "docker-entrypoint.s…"   28 seconds ago   Up 26 seconds   0.0.0.0:5432->5432/tcp                                                                                                                                               tools_postgres_1
     ```
 **NOTE**
@@ -3600,7 +3600,7 @@ class LaunchConfigurationBaseSerializer(BaseSerializer):
         ujt = self.instance.unified_job_template
         if ujt is None:
             ret = {}
-            for fd in ('workflow_job_template', 'identifier'):
+            for fd in ('workflow_job_template', 'identifier', 'all_parents_must_converge'):
                 if fd in attrs:
                     ret[fd] = attrs[fd]
             return ret
@@ -35,6 +35,7 @@ class WorkerSignalHandler:
 
     def __init__(self):
         self.kill_now = False
+        signal.signal(signal.SIGTERM, signal.SIG_DFL)
         signal.signal(signal.SIGINT, self.exit_gracefully)
 
     def exit_gracefully(self, *args, **kwargs):
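The hunk above registers SIGINT for a graceful shutdown while resetting SIGTERM to the default action (so the process can still be killed outright). A minimal, self-contained sketch of the same pattern (`GracefulExit` is an illustrative name, not AWX's class):

```python
import signal

class GracefulExit:
    """Flip a flag on SIGINT so a worker loop can finish its current
    item and exit; leave SIGTERM at the default (terminate) action."""

    def __init__(self):
        self.kill_now = False
        signal.signal(signal.SIGTERM, signal.SIG_DFL)
        signal.signal(signal.SIGINT, self.exit_gracefully)

    def exit_gracefully(self, *args, **kwargs):
        self.kill_now = True

handler = GracefulExit()
# Deliver SIGINT to this process (Python 3.8+); the handler runs synchronously.
signal.raise_signal(signal.SIGINT)
print(handler.kill_now)  # -> True
```

A worker loop would check `handler.kill_now` between work items rather than being interrupted mid-task.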
@@ -16,31 +16,24 @@ class InstanceNotFound(Exception):
         super(InstanceNotFound, self).__init__(*args, **kwargs)
 
 
-class Command(BaseCommand):
+class RegisterQueue:
 
+    def __init__(self, queuename, controller, instance_percent, inst_min, hostname_list):
+        self.instance_not_found_err = None
+        self.queuename = queuename
+        self.controller = controller
+        self.instance_percent = instance_percent
+        self.instance_min = inst_min
+        self.hostname_list = hostname_list
 
-    def add_arguments(self, parser):
-        parser.add_argument('--queuename', dest='queuename', type=str,
-                            help='Queue to create/update')
-        parser.add_argument('--hostnames', dest='hostnames', type=str,
-                            help='Comma-Delimited Hosts to add to the Queue (will not remove already assigned instances)')
-        parser.add_argument('--controller', dest='controller', type=str,
-                            default='', help='The controlling group (makes this an isolated group)')
-        parser.add_argument('--instance_percent', dest='instance_percent', type=int, default=0,
-                            help='The percentage of active instances that will be assigned to this group')
-        parser.add_argument('--instance_minimum', dest='instance_minimum', type=int, default=0,
-                            help='The minimum number of instances that will be retained for this group from available instances')
-
-    def get_create_update_instance_group(self, queuename, instance_percent, instance_min):
+    def get_create_update_instance_group(self):
         created = False
         changed = False
-        (ig, created) = InstanceGroup.objects.get_or_create(name=queuename)
+        (ig, created) = InstanceGroup.objects.get_or_create(name=self.queuename)
-        if ig.policy_instance_percentage != instance_percent:
+        if ig.policy_instance_percentage != self.instance_percent:
-            ig.policy_instance_percentage = instance_percent
+            ig.policy_instance_percentage = self.instance_percent
             changed = True
-        if ig.policy_instance_minimum != instance_min:
+        if ig.policy_instance_minimum != self.instance_min:
-            ig.policy_instance_minimum = instance_min
+            ig.policy_instance_minimum = self.instance_min
             changed = True
 
         if changed:
@@ -48,12 +41,12 @@ class Command(BaseCommand):
 
         return (ig, created, changed)
 
-    def update_instance_group_controller(self, ig, controller):
+    def update_instance_group_controller(self, ig):
         changed = False
         control_ig = None
 
-        if controller:
+        if self.controller:
-            control_ig = InstanceGroup.objects.filter(name=controller).first()
+            control_ig = InstanceGroup.objects.filter(name=self.controller).first()
 
         if control_ig and ig.controller_id != control_ig.pk:
             ig.controller = control_ig
@@ -62,10 +55,10 @@ class Command(BaseCommand):
 
         return (control_ig, changed)
 
-    def add_instances_to_group(self, ig, hostname_list):
+    def add_instances_to_group(self, ig):
         changed = False
 
-        instance_list_unique = set([x.strip() for x in hostname_list if x])
+        instance_list_unique = set([x.strip() for x in self.hostname_list if x])
         instances = []
         for inst_name in instance_list_unique:
             instance = Instance.objects.filter(hostname=inst_name)
@@ -86,43 +79,61 @@ class Command(BaseCommand):
 
         return (instances, changed)
 
-    def handle(self, **options):
+    def register(self):
-        instance_not_found_err = None
-        queuename = options.get('queuename')
-        if not queuename:
-            raise CommandError("Specify `--queuename` to use this command.")
-        ctrl = options.get('controller')
-        inst_per = options.get('instance_percent')
-        inst_min = options.get('instance_minimum')
-        hostname_list = []
-        if options.get('hostnames'):
-            hostname_list = options.get('hostnames').split(",")
-
         with advisory_lock('cluster_policy_lock'):
             with transaction.atomic():
                 changed2 = False
                 changed3 = False
-                (ig, created, changed1) = self.get_create_update_instance_group(queuename, inst_per, inst_min)
+                (ig, created, changed1) = self.get_create_update_instance_group()
                 if created:
                     print("Creating instance group {}".format(ig.name))
                 elif not created:
                     print("Instance Group already registered {}".format(ig.name))
 
-                if ctrl:
+                if self.controller:
-                    (ig_ctrl, changed2) = self.update_instance_group_controller(ig, ctrl)
+                    (ig_ctrl, changed2) = self.update_instance_group_controller(ig)
                     if changed2:
-                        print("Set controller group {} on {}.".format(ctrl, queuename))
+                        print("Set controller group {} on {}.".format(self.controller, self.queuename))
 
                 try:
-                    (instances, changed3) = self.add_instances_to_group(ig, hostname_list)
+                    (instances, changed3) = self.add_instances_to_group(ig)
                     for i in instances:
                         print("Added instance {} to {}".format(i.hostname, ig.name))
                 except InstanceNotFound as e:
-                    instance_not_found_err = e
+                    self.instance_not_found_err = e
 
         if any([changed1, changed2, changed3]):
             print('(changed: True)')
 
-        if instance_not_found_err:
-            print(instance_not_found_err.message)
+
+class Command(BaseCommand):
+
+    def add_arguments(self, parser):
+        parser.add_argument('--queuename', dest='queuename', type=str,
+                            help='Queue to create/update')
+        parser.add_argument('--hostnames', dest='hostnames', type=str,
+                            help='Comma-Delimited Hosts to add to the Queue (will not remove already assigned instances)')
+        parser.add_argument('--controller', dest='controller', type=str,
+                            default='', help='The controlling group (makes this an isolated group)')
+        parser.add_argument('--instance_percent', dest='instance_percent', type=int, default=0,
+                            help='The percentage of active instances that will be assigned to this group')
+        parser.add_argument('--instance_minimum', dest='instance_minimum', type=int, default=0,
+                            help='The minimum number of instances that will be retained for this group from available instances')
+
+    def handle(self, **options):
+        queuename = options.get('queuename')
+        if not queuename:
+            raise CommandError("Specify `--queuename` to use this command.")
+        ctrl = options.get('controller')
+        inst_per = options.get('instance_percent')
+        instance_min = options.get('instance_minimum')
+        hostname_list = []
+        if options.get('hostnames'):
+            hostname_list = options.get('hostnames').split(",")
+
+        rq = RegisterQueue(queuename, ctrl, inst_per, instance_min, hostname_list)
+        rq.register()
+        if rq.instance_not_found_err:
+            print(rq.instance_not_found_err.message)
             sys.exit(1)
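The refactor above pulls the queue-registration logic out of the Django management command into a standalone `RegisterQueue` class so other code paths (such as auto-provisioning) can reuse it without going through the CLI. A framework-free sketch of the same split, with illustrative names and behavior (not AWX's actual API):

```python
import argparse

class RegisterQueueSketch:
    """Reusable logic, independent of any CLI framework."""

    def __init__(self, queuename, hostname_list):
        self.queuename = queuename
        self.hostname_list = hostname_list
        self.registered = []

    def register(self):
        # Deduplicate and strip hostnames, as the real command does.
        for host in sorted(set(h.strip() for h in self.hostname_list if h)):
            self.registered.append(host)
        return self.registered

def handle(argv):
    """Thin CLI wrapper: parse options, then delegate to the class."""
    parser = argparse.ArgumentParser()
    parser.add_argument('--queuename', required=True)
    parser.add_argument('--hostnames', default='')
    opts = parser.parse_args(argv)
    rq = RegisterQueueSketch(opts.queuename, opts.hostnames.split(','))
    return rq.register()

print(handle(['--queuename', 'tower', '--hostnames', 'node1, node2,node1']))
# -> ['node1', 'node2']
```

The command stays a thin shell: all it owns is argument parsing and exit-code handling, which is exactly what lets `get_or_register` call `RegisterQueue(...)` directly later in this commit.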
@@ -149,8 +149,11 @@ class InstanceManager(models.Manager):
 
     def get_or_register(self):
         if settings.AWX_AUTO_DEPROVISION_INSTANCES:
+            from awx.main.management.commands.register_queue import RegisterQueue
             pod_ip = os.environ.get('MY_POD_IP')
-            return self.register(ip_address=pod_ip)
+            registered = self.register(ip_address=pod_ip)
+            RegisterQueue('tower', None, 100, 0, []).register()
+            return registered
         else:
             return (False, self.me())
@@ -3,7 +3,6 @@
 
 import logging
 import requests
-import json
 
 from django.utils.encoding import smart_text
 from django.utils.translation import ugettext_lazy as _
@@ -45,7 +44,7 @@ class MattermostBackend(AWXBaseEmailBackend, CustomNotificationBase):
             payload['text'] = m.subject
 
             r = requests.post("{}".format(m.recipients()[0]),
-                              data=json.dumps(payload), verify=(not self.mattermost_no_verify_ssl))
+                              json=payload, verify=(not self.mattermost_no_verify_ssl))
             if r.status_code >= 400:
                 logger.error(smart_text(_("Error sending notification mattermost: {}").format(r.text)))
                 if not self.fail_silently:
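The Mattermost fix above is subtle: `data=json.dumps(payload)` sends the right bytes but no `Content-Type` header, while `json=payload` serializes the body *and* sets `Content-Type: application/json`, which newer Mattermost versions require. A stdlib illustration of what the `json=` form does behind the scenes (the webhook URL is hypothetical):

```python
import json
import urllib.request

payload = {"text": "Job finished", "channel": "town-square"}

# Equivalent of requests.post(url, json=payload): serialize the body
# AND declare the content type, so the server parses it as JSON.
req = urllib.request.Request(
    "https://mattermost.example.com/hooks/abc123",  # hypothetical webhook
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},   # what `json=` adds for free
)

print(req.get_header("Content-type"))  # -> application/json
```

With the old `data=` form, the request went out with no declared content type, which older Mattermost tolerated and newer versions reject.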
@@ -581,3 +581,4 @@ class TaskManager():
         logger.debug("Starting Scheduler")
         with task_manager_bulk_reschedule():
             self._schedule()
+        logger.debug("Finishing Scheduler")
@@ -71,6 +71,18 @@ def test_node_accepts_prompted_fields(inventory, project, workflow_job_template,
                 user=admin_user, expect=201)
 
 
+@pytest.mark.django_db
+@pytest.mark.parametrize("field_name, field_value", [
+    ('all_parents_must_converge', True),
+    ('all_parents_must_converge', False),
+])
+def test_create_node_with_field(field_name, field_value, workflow_job_template, post, admin_user):
+    url = reverse('api:workflow_job_template_workflow_nodes_list',
+                  kwargs={'pk': workflow_job_template.pk})
+    res = post(url, {field_name: field_value}, user=admin_user, expect=201)
+    assert res.data[field_name] == field_value
+
+
 @pytest.mark.django_db
 class TestApprovalNodes():
     def test_approval_node_creation(self, post, approval_node, admin_user):
@@ -13,6 +13,10 @@ class RSysLogHandler(logging.handlers.SysLogHandler):
 
     append_nul = False
 
+    def _connect_unixsocket(self, address):
+        super(RSysLogHandler, self)._connect_unixsocket(address)
+        self.socket.setblocking(False)
+
     def emit(self, msg):
         if not settings.LOG_AGGREGATOR_ENABLED:
             return
@@ -26,6 +30,14 @@ class RSysLogHandler(logging.handlers.SysLogHandler):
             # unfortunately, we can't log that because...rsyslogd is down (and
             # would just send us back down this code path)
             pass
+        except BlockingIOError:
+            # for <some reason>, rsyslogd is no longer reading from the domain socket, and
+            # we're unable to write any more to it without blocking (we've seen this behavior
+            # from time to time when logging is totally misconfigured;
+            # in this scenario, it also makes more sense to just drop the messages,
+            # because the alternative is blocking the socket.send() in the
+            # Python process, which we definitely don't want to do)
+            pass
 
 
 ColorHandler = logging.StreamHandler
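The new `except BlockingIOError` branch only works because `_connect_unixsocket` now puts the socket in non-blocking mode: once the peer stops reading and the kernel buffer fills, `send()` raises `BlockingIOError` instead of stalling the whole process. A self-contained demonstration with a socket pair whose reader never drains (a sketch, not AWX code):

```python
import socket

# Create a connected pair; the "rsyslogd" end (reader) never reads.
writer, reader = socket.socketpair()
writer.setblocking(False)  # what the patched _connect_unixsocket does

sent = 0
dropped = 0
for _ in range(100000):
    try:
        writer.send(b"x" * 1024)
        sent += 1
    except BlockingIOError:
        # Kernel buffer is full and the reader isn't draining it:
        # drop the message rather than block the process.
        dropped += 1
        break

# Some messages fit in the kernel buffer; then we start dropping.
print(sent > 0, dropped)

writer.close()
reader.close()
```

With a blocking socket the loop would simply hang on `send()` once the buffer filled, which is exactly the failure mode the patch avoids.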
@@ -439,10 +439,11 @@ CELERYBEAT_SCHEDULE = {
 }
 
 # Django Caching Configuration
+DJANGO_REDIS_IGNORE_EXCEPTIONS = True
 CACHES = {
     'default': {
-        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
+        'BACKEND': 'django_redis.cache.RedisCache',
-        'LOCATION': 'unix:/var/run/memcached/memcached.sock'
+        'LOCATION': 'unix:/var/run/redis/redis.sock?db=1'
     },
 }
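Two things are worth noting in the cache swap above, sketched below as a plain settings fragment. `DJANGO_REDIS_IGNORE_EXCEPTIONS = True` makes django-redis treat a down Redis as a cache miss instead of raising, and in the `LOCATION` URL the `unix:` scheme selects a Unix-domain socket while `?db=1` picks the Redis database index (the comment annotations are ours, not part of the patch):

```python
# settings.py fragment: Django caching via django-redis over a Unix socket.
DJANGO_REDIS_IGNORE_EXCEPTIONS = True  # a briefly-down Redis degrades to cache misses

CACHES = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        # unix:<socket path>?db=<redis database index>
        'LOCATION': 'unix:/var/run/redis/redis.sock?db=1',
    },
}
```

Using db 1 keeps Django's cache keys in a separate keyspace from anything else (e.g. the broker) that may already be using db 0 on the same Redis instance.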
@@ -740,7 +740,9 @@ class SAMLOrgAttrField(HybridDictField):
 
 class SAMLTeamAttrTeamOrgMapField(HybridDictField):
 
     team = fields.CharField(required=True, allow_null=False)
+    team_alias = fields.CharField(required=False, allow_null=True)
     organization = fields.CharField(required=True, allow_null=False)
+    organization_alias = fields.CharField(required=False, allow_null=True)
 
     child = _Forbidden()
@@ -187,13 +187,22 @@ def update_user_teams_by_saml_attr(backend, details, user=None, *args, **kwargs)
 
     team_ids = []
     for team_name_map in team_map.get('team_org_map', []):
-        team_name = team_name_map.get('team', '')
+        team_name = team_name_map.get('team', None)
+        team_alias = team_name_map.get('team_alias', None)
+        organization_name = team_name_map.get('organization', None)
+        organization_alias = team_name_map.get('organization_alias', None)
         if team_name in saml_team_names:
-            if not team_name_map.get('organization', ''):
+            if not organization_name:
                 # Settings field validation should prevent this.
                 logger.error("organization name invalid for team {}".format(team_name))
                 continue
-            org = Organization.objects.get_or_create(name=team_name_map['organization'])[0]
+
+            if organization_alias:
+                organization_name = organization_alias
+            org = Organization.objects.get_or_create(name=organization_name)[0]
+
+            if team_alias:
+                team_name = team_alias
             team = Team.objects.get_or_create(name=team_name, organization=org)[0]
 
             team_ids.append(team.id)
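The alias logic above has one subtlety: the SAML-provided team name is matched *before* any aliases apply, and the aliases only change the names of the local Team/Organization objects that get created. That ordering can be isolated from the ORM in a small pure function (names and sample data here are illustrative):

```python
def resolve_team_org(team_name_map, saml_team_names):
    """Return the (team, organization) names to use locally for one
    team_org_map entry, applying optional aliases; None if the entry
    doesn't match the SAML attribute or is missing an organization."""
    team_name = team_name_map.get('team')
    if team_name not in saml_team_names:
        return None
    organization_name = team_name_map.get('organization')
    if not organization_name:
        return None  # settings field validation should prevent this
    # Aliases affect only the local names, never the SAML match above.
    team = team_name_map.get('team_alias') or team_name
    org = team_name_map.get('organization_alias') or organization_name
    return (team, org)

entry = {
    'team': 'Yellow', 'team_alias': 'Yellow_Alias',
    'organization': 'Default4', 'organization_alias': 'Default4_Alias',
}
print(resolve_team_org(entry, {'Yellow'}))
# -> ('Yellow_Alias', 'Default4_Alias')
print(resolve_team_org(entry, {'Red'}))
# -> None
```

This mirrors the new test below: the IdP sends `Yellow`, but the objects created locally are `Yellow_Alias` in `Default4_Alias`.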
@@ -193,6 +193,10 @@ class TestSAMLAttr():
             {'team': 'Red', 'organization': 'Default1'},
             {'team': 'Green', 'organization': 'Default1'},
             {'team': 'Green', 'organization': 'Default3'},
+            {
+                'team': 'Yellow', 'team_alias': 'Yellow_Alias',
+                'organization': 'Default4', 'organization_alias': 'Default4_Alias'
+            },
         ]
     }
     return MockSettings()
@@ -285,3 +289,18 @@ class TestSAMLAttr():
         assert Team.objects.get(name='Green', organization__name='Default1').member_role.members.count() == 3
         assert Team.objects.get(name='Green', organization__name='Default3').member_role.members.count() == 3
 
+    def test_update_user_teams_alias_by_saml_attr(self, orgs, users, kwargs, mock_settings):
+        with mock.patch('django.conf.settings', mock_settings):
+            u1 = users[0]
+
+            # Test getting teams from attribute with team->org mapping
+            kwargs['response']['attributes']['groups'] = ['Yellow']
+
+            # Ensure team and org will be created
+            update_user_teams_by_saml_attr(None, None, u1, **kwargs)
+
+            assert Team.objects.filter(name='Yellow', organization__name='Default4').count() == 0
+            assert Team.objects.filter(name='Yellow_Alias', organization__name='Default4_Alias').count() == 1
+            assert Team.objects.get(
+                name='Yellow_Alias', organization__name='Default4_Alias').member_role.members.count() == 1
@@ -71,6 +71,14 @@ class TestSAMLTeamAttrField():
             {'team': 'Engineering', 'organization': 'Ansible2'},
             {'team': 'Engineering2', 'organization': 'Ansible'},
         ]},
+        {'remove': True, 'saml_attr': 'foobar', 'team_org_map': [
+            {
+                'team': 'Engineering', 'team_alias': 'Engineering Team',
+                'organization': 'Ansible', 'organization_alias': 'Awesome Org'
+            },
+            {'team': 'Engineering', 'organization': 'Ansible2'},
+            {'team': 'Engineering2', 'organization': 'Ansible'},
+        ]},
     ])
     def test_internal_value_valid(self, data):
         field = SAMLTeamAttrField()
awx/ui/package-lock.json (generated)
@@ -14435,9 +14435,9 @@
         }
       },
       "websocket-extensions": {
-        "version": "0.1.3",
+        "version": "0.1.4",
-        "resolved": "https://registry.npmjs.org/websocket-extensions/-/websocket-extensions-0.1.3.tgz",
+        "resolved": "https://registry.npmjs.org/websocket-extensions/-/websocket-extensions-0.1.4.tgz",
-        "integrity": "sha512-nqHUnMXmBzT0w570r2JpJxfiSD1IzoI+HGVdd3aZ0yNi3ngvQ4jv1dtHt5VGxfI2yj5yqImPhOK4vmIh2xMbGg==",
+        "integrity": "sha512-OqedPIGOfsDlo31UNwYbCFMSaO9m9G/0faIHj5/dZFDMFqPTcx6UwqyOy3COEaEOg/9VsGIpdqn62W5KhoKSpg==",
         "dev": true
       },
       "whet.extend": {
awx/ui_next/package-lock.json (generated)
@@ -16320,9 +16320,9 @@
         }
       },
       "websocket-extensions": {
-        "version": "0.1.3",
+        "version": "0.1.4",
-        "resolved": "https://registry.npmjs.org/websocket-extensions/-/websocket-extensions-0.1.3.tgz",
+        "resolved": "https://registry.npmjs.org/websocket-extensions/-/websocket-extensions-0.1.4.tgz",
-        "integrity": "sha512-nqHUnMXmBzT0w570r2JpJxfiSD1IzoI+HGVdd3aZ0yNi3ngvQ4jv1dtHt5VGxfI2yj5yqImPhOK4vmIh2xMbGg=="
+        "integrity": "sha512-OqedPIGOfsDlo31UNwYbCFMSaO9m9G/0faIHj5/dZFDMFqPTcx6UwqyOy3COEaEOg/9VsGIpdqn62W5KhoKSpg=="
       },
       "whatwg-encoding": {
         "version": "1.0.5",
|||||||
@@ -40,7 +40,13 @@ class ClipboardCopyButton extends React.Component {
|
|||||||
};
|
};
|
||||||
|
|
||||||
render() {
|
render() {
|
||||||
const { clickTip, entryDelay, exitDelay, hoverTip } = this.props;
|
const {
|
||||||
|
copyTip,
|
||||||
|
entryDelay,
|
||||||
|
exitDelay,
|
||||||
|
copiedSuccessTip,
|
||||||
|
isDisabled,
|
||||||
|
} = this.props;
|
||||||
const { copied } = this.state;
|
const { copied } = this.state;
|
||||||
|
|
||||||
return (
|
return (
|
||||||
@@ -48,12 +54,13 @@ class ClipboardCopyButton extends React.Component {
         entryDelay={entryDelay}
         exitDelay={exitDelay}
         trigger="mouseenter focus click"
-        content={copied ? clickTip : hoverTip}
+        content={copied ? copiedSuccessTip : copyTip}
       >
         <Button
+          isDisabled={isDisabled}
           variant="plain"
           onClick={this.handleCopyClick}
-          aria-label={hoverTip}
+          aria-label={copyTip}
         >
           <CopyIcon />
         </Button>
@@ -63,12 +70,13 @@ class ClipboardCopyButton extends React.Component {
 }
 
 ClipboardCopyButton.propTypes = {
-  clickTip: PropTypes.string.isRequired,
+  copyTip: PropTypes.string.isRequired,
   entryDelay: PropTypes.number,
   exitDelay: PropTypes.number,
-  hoverTip: PropTypes.string.isRequired,
+  copiedSuccessTip: PropTypes.string.isRequired,
   stringToCopy: PropTypes.string.isRequired,
   switchDelay: PropTypes.number,
+  isDisabled: PropTypes.bool.isRequired,
 };
 
 ClipboardCopyButton.defaultProps = {
@@ -13,6 +13,7 @@ describe('ClipboardCopyButton', () => {
         clickTip="foo"
         hoverTip="bar"
         stringToCopy="foobar!"
+        isDisabled={false}
       />
     );
     expect(wrapper).toHaveLength(1);
@@ -23,6 +24,7 @@ describe('ClipboardCopyButton', () => {
         clickTip="foo"
         hoverTip="bar"
         stringToCopy="foobar!"
+        isDisabled={false}
       />
     ).find('ClipboardCopyButton');
     expect(wrapper.state('copied')).toBe(false);
@@ -33,4 +35,15 @@ describe('ClipboardCopyButton', () => {
     wrapper.update();
     expect(wrapper.state('copied')).toBe(false);
   });
+
+  test('should render disabled button', () => {
+    const wrapper = mountWithContexts(
+      <ClipboardCopyButton
+        clickTip="foo"
+        hoverTip="bar"
+        stringToCopy="foobar!"
+        isDisabled
+      />
+    );
+    expect(wrapper.find('Button').prop('isDisabled')).toBe(true);
+  });
 });
@@ -138,7 +138,7 @@ function getRouteConfig(i18n) {
       screen: InstanceGroups,
     },
     {
-      title: i18n._(t`Integrations`),
+      title: i18n._(t`Applications`),
       path: '/applications',
       screen: Applications,
     },
@@ -0,0 +1,26 @@
+import React from 'react';
+import { Route, Switch, Redirect } from 'react-router-dom';
+import ApplicationEdit from '../ApplicationEdit';
+import ApplicationDetails from '../ApplicationDetails';
+
+function Application() {
+  return (
+    <>
+      <Switch>
+        <Redirect
+          from="/applications/:id"
+          to="/applications/:id/details"
+          exact
+        />
+        <Route path="/applications/:id/edit">
+          <ApplicationEdit />
+        </Route>
+        <Route path="/applications/:id/details">
+          <ApplicationDetails />
+        </Route>
+      </Switch>
+    </>
+  );
+}
+
+export default Application;
awx/ui_next/src/screens/Application/Application/index.js (new file)
@@ -0,0 +1 @@
+export { default } from './Application';
@@ -0,0 +1,15 @@
+import React from 'react';
+import { Card, PageSection } from '@patternfly/react-core';
+
+function ApplicationAdd() {
+  return (
+    <>
+      <PageSection>
+        <Card>
+          <div>Applications Add</div>
+        </Card>
+      </PageSection>
+    </>
+  );
+}
+export default ApplicationAdd;
@@ -0,0 +1 @@
+export { default } from './ApplicationAdd';
@@ -0,0 +1,11 @@
+import React from 'react';
+import { Card, PageSection } from '@patternfly/react-core';
+
+function ApplicationDetails() {
+  return (
+    <PageSection>
+      <Card>Application Details</Card>
+    </PageSection>
+  );
+}
+export default ApplicationDetails;
@@ -0,0 +1 @@
+export { default } from './ApplicationDetails';
@@ -0,0 +1,11 @@
+import React from 'react';
+import { Card, PageSection } from '@patternfly/react-core';
+
+function ApplicationEdit() {
+  return (
+    <PageSection>
+      <Card>Application Edit</Card>
+    </PageSection>
+  );
+}
+export default ApplicationEdit;
@@ -0,0 +1 @@
+export { default } from './ApplicationEdit';
@@ -1,26 +1,49 @@
-import React, { Component, Fragment } from 'react';
+import React, { useState, useCallback } from 'react';
 import { withI18n } from '@lingui/react';
 import { t } from '@lingui/macro';
-import {
-  PageSection,
-  PageSectionVariants,
-  Title,
-} from '@patternfly/react-core';
+import { Route, Switch } from 'react-router-dom';
 
-class Applications extends Component {
-  render() {
-    const { i18n } = this.props;
-    const { light } = PageSectionVariants;
-
-    return (
-      <Fragment>
-        <PageSection variant={light} className="pf-m-condensed">
-          <Title size="2xl">{i18n._(t`Applications`)}</Title>
-        </PageSection>
-        <PageSection />
-      </Fragment>
-    );
-  }
-}
+import ApplicationsList from './ApplicationsList';
+import ApplicationAdd from './ApplicationAdd';
+import Application from './Application';
+import Breadcrumbs from '../../components/Breadcrumbs';
+
+function Applications({ i18n }) {
+  const [breadcrumbConfig, setBreadcrumbConfig] = useState({
+    '/applications': i18n._(t`Applications`),
+    '/applications/add': i18n._(t`Create New Application`),
+  });
+
+  const buildBreadcrumbConfig = useCallback(
+    application => {
+      if (!application) {
+        return;
+      }
+
+      setBreadcrumbConfig({
+        '/applications': i18n._(t`Applications`),
+        '/applications/add': i18n._(t`Create New Application`),
+        [`/application/${application.id}`]: `${application.name}`,
+      });
+    },
+    [i18n]
+  );
+
+  return (
+    <>
+      <Breadcrumbs breadcrumbConfig={breadcrumbConfig} />
+      <Switch>
+        <Route path="/applications/add">
+          <ApplicationAdd />
+        </Route>
+        <Route path="/applications/:id">
+          <Application setBreadcrumb={buildBreadcrumbConfig} />
+        </Route>
+        <Route path="/applications">
+          <ApplicationsList />
+        </Route>
+      </Switch>
+    </>
+  );
+}
 
 export default withI18n()(Applications);
@@ -7,12 +7,10 @@ import Applications from './Applications';
 describe('<Applications />', () => {
   let pageWrapper;
   let pageSections;
-  let title;
 
   beforeEach(() => {
     pageWrapper = mountWithContexts(<Applications />);
     pageSections = pageWrapper.find('PageSection');
-    title = pageWrapper.find('Title');
   });
 
   afterEach(() => {
@@ -21,9 +19,7 @@ describe('<Applications />', () => {
 
   test('initially renders without crashing', () => {
     expect(pageWrapper.length).toBe(1);
-    expect(pageSections.length).toBe(2);
-    expect(title.length).toBe(1);
-    expect(title.props().size).toBe('2xl');
+    expect(pageSections.length).toBe(1);
     expect(pageSections.first().props().variant).toBe('light');
   });
 });
@@ -0,0 +1,15 @@
+import React from 'react';
+import { Card, PageSection } from '@patternfly/react-core';
+
+function ApplicationsList() {
+  return (
+    <>
+      <PageSection>
+        <Card>
+          <div>Applications List</div>
+        </Card>
+      </PageSection>
+    </>
+  );
+}
+export default ApplicationsList;
@@ -0,0 +1 @@
+export { default } from './ApplicationsList';
@@ -32,6 +32,11 @@ const DataListAction = styled(_DataListAction)`
   grid-gap: 16px;
   grid-template-columns: repeat(3, 40px);
 `;
 
+const Label = styled.span`
+  color: var(--pf-global--disabled-color--100);
+`;
+
 function ProjectListItem({
   project,
   isSelected,
@@ -121,13 +126,17 @@ function ProjectListItem({
         </DataListCell>,
         <DataListCell key="revision">
           {project.scm_revision.substring(0, 7)}
-          {project.scm_revision ? (
-            <ClipboardCopyButton
-              stringToCopy={project.scm_revision}
-              hoverTip={i18n._(t`Copy full revision to clipboard.`)}
-              clickTip={i18n._(t`Successfully copied to clipboard!`)}
-            />
-          ) : null}
+          {!project.scm_revision && (
+            <Label aria-label={i18n._(t`copy to clipboard disabled`)}>
+              {i18n._(t`Sync for revision`)}
+            </Label>
+          )}
+          <ClipboardCopyButton
+            isDisabled={!project.scm_revision}
+            stringToCopy={project.scm_revision}
+            copyTip={i18n._(t`Copy full revision to clipboard.`)}
+            copiedSuccessTip={i18n._(t`Successfully copied to clipboard!`)}
+          />
         </DataListCell>,
       ]}
     />
@@ -218,4 +218,34 @@ describe('<ProjectsListItem />', () => {
     );
     expect(wrapper.find('CopyButton').length).toBe(0);
   });
+
+  test('should render disabled copy to clipboard button', () => {
+    const wrapper = mountWithContexts(
+      <ProjectsListItem
+        isSelected={false}
+        detailUrl="/project/1"
+        onSelect={() => {}}
+        project={{
+          id: 1,
+          name: 'Project 1',
+          url: '/api/v2/projects/1',
+          type: 'project',
+          scm_type: 'git',
+          scm_revision: '',
+          summary_fields: {
+            last_job: {
+              id: 9000,
+              status: 'successful',
+            },
+            user_capabilities: {
+              edit: true,
+            },
+          },
+        }}
+      />
+    );
+    expect(
+      wrapper.find('span[aria-label="copy to clipboard disabled"]').text()
+    ).toBe('Sync for revision');
+    expect(wrapper.find('ClipboardCopyButton').prop('isDisabled')).toBe(true);
+  });
 });
@@ -19,7 +19,7 @@ function WorkflowJobTemplateEdit({ template }) {
     webhook_key,
     ...templatePayload
   } = values;
-  templatePayload.inventory = inventory?.id;
+  templatePayload.inventory = inventory?.id || null;
   templatePayload.organization = organization?.id;
   templatePayload.webhook_credential = webhook_credential?.id || null;
@@ -1,13 +1,13 @@
 import React, { useState } from 'react';
 import { t } from '@lingui/macro';
 
 import PropTypes, { shape } from 'prop-types';
 
 import { withI18n } from '@lingui/react';
 import { useField, withFormik } from 'formik';
-import { Form, FormGroup, Checkbox } from '@patternfly/react-core';
+import { Form, FormGroup, Checkbox, TextInput } from '@patternfly/react-core';
 import { required } from '../../../util/validators';
-
+import FieldWithPrompt from '../../../components/FieldWithPrompt';
 import FormField, {
   FieldTooltip,
   FormSubmitError,
@@ -36,19 +36,20 @@ function WorkflowJobTemplateForm({
   i18n,
   submitError,
 }) {
-  const [hasContentError, setContentError] = useState(null);
-  const [organizationField, organizationMeta, organizationHelpers] = useField(
-    'organization'
+  const [enableWebhooks, setEnableWebhooks] = useState(
+    Boolean(template.webhook_service)
   );
+  const [hasContentError, setContentError] = useState(null);
+  const [askInventoryOnLaunchField] = useField('ask_inventory_on_launch');
   const [inventoryField, inventoryMeta, inventoryHelpers] = useField(
     'inventory'
   );
   const [labelsField, , labelsHelpers] = useField('labels');
-  const [enableWebhooks, setEnableWebhooks] = useState(
-    Boolean(template.webhook_service)
+  const [limitField, limitMeta, limitHelpers] = useField('limit');
+  const [organizationField, organizationMeta, organizationHelpers] = useField(
+    'organization'
   );
+  const [scmField, , scmHelpers] = useField('scm_branch');
 
   if (hasContentError) {
     return <ContentError error={hasContentError} />;
@@ -79,39 +80,74 @@ function WorkflowJobTemplateForm({
           value={organizationField.value}
           isValid={!organizationMeta.error}
         />
-        <FormGroup label={i18n._(t`Inventory`)} fieldId="wfjt-inventory">
-          <FieldTooltip
-            content={i18n._(
-              t`Select an inventory for the workflow. This inventory is applied to all job template nodes that prompt for an inventory.`
-            )}
-          />
+        <FieldWithPrompt
+          fieldId="wfjt-inventory"
+          label={i18n._(t`Inventory`)}
+          promptId="wfjt-ask-inventory-on-launch"
+          promptName="ask_inventory_on_launch"
+          tooltip={i18n._(
+            t`Select an inventory for the workflow. This inventory is applied to all job template nodes that prompt for an inventory.`
+          )}
+        >
           <InventoryLookup
             value={inventoryField.value}
-            isValid={!inventoryMeta.error}
-            helperTextInvalid={inventoryMeta.error}
+            onBlur={() => inventoryHelpers.setTouched()}
             onChange={value => {
-              inventoryHelpers.setValue(value || null);
+              inventoryHelpers.setValue(value);
+            }}
+            required={askInventoryOnLaunchField.value}
+            touched={inventoryMeta.touched}
+            error={inventoryMeta.error}
+          />
+          {(inventoryMeta.touched || askInventoryOnLaunchField.value) &&
+            inventoryMeta.error && (
+              <div
+                className="pf-c-form__helper-text pf-m-error"
+                aria-live="polite"
+              >
+                {inventoryMeta.error}
+              </div>
+            )}
+        </FieldWithPrompt>
+
+        <FieldWithPrompt
+          fieldId="wjft-limit"
+          label={i18n._(t`Limit`)}
+          promptId="template-ask-limit-on-launch"
+          promptName="ask_limit_on_launch"
+          tooltip={i18n._(t`Provide a host pattern to further constrain
+          the list of hosts that will be managed or affected by the
+          playbook. Multiple patterns are allowed. Refer to Ansible
+          documentation for more information and examples on patterns.`)}
+        >
+          <TextInput
+            id="text-wfjt-limit"
+            {...limitField}
+            isValid={!limitMeta.touched || !limitMeta.error}
+            onChange={value => {
+              limitHelpers.setValue(value);
             }}
           />
-        </FormGroup>
-        <FormField
-          type="text"
-          name="limit"
-          id="wfjt-limit"
-          label={i18n._(t`Limit`)}
-          tooltip={i18n._(
-            t`Provide a host pattern to further constrain the list of hosts that will be managed or affected by the workflow. This limit is applied to all job template nodes that prompt for a limit. Refer to Ansible documentation for more information and examples on patterns.`
-          )}
-        />
-        <FormField
-          type="text"
-          label={i18n._(t`Source Control Branch`)}
+        </FieldWithPrompt>
+
+        <FieldWithPrompt
+          fieldId="wfjt-scm-branch"
+          label={i18n._(t`Source control branch`)}
+          promptId="wfjt-ask-scm-branch-on-launch"
+          promptName="ask_scm_branch_on_launch"
           tooltip={i18n._(
             t`Select a branch for the workflow. This branch is applied to all job template nodes that prompt for a branch.`
           )}
-          id="wfjt-scm_branch"
-          name="scm_branch"
-        />
+        >
+          <TextInput
+            id="text-wfjt-scm-branch"
+            {...scmField}
+            onChange={value => {
+              scmHelpers.setValue(value);
+            }}
+          />
+        </FieldWithPrompt>
       </FormColumnLayout>
       <FormFullWidthLayout>
        <FormGroup label={i18n._(t`Labels`)} fieldId="template-labels">
@@ -133,6 +169,7 @@ function WorkflowJobTemplateForm({
           id="wfjt-variables"
           name="extra_vars"
           label={i18n._(t`Variables`)}
+          promptId="template-ask-variables-on-launch"
           tooltip={i18n._(
             t`Pass extra command line variables to the playbook. This is the -e or --extra-vars command line parameter for ansible-playbook. Provide key/value pairs using either YAML or JSON. Refer to the Ansible Tower documentation for example syntax.`
           )}
@@ -114,9 +114,9 @@ describe('<WorkflowJobTemplateForm/>', () => {
       'FormField[name="name"]',
       'FormField[name="description"]',
       'FormGroup[label="Organization"]',
-      'FormGroup[label="Inventory"]',
-      'FormField[name="limit"]',
-      'FormField[name="scm_branch"]',
+      'FieldWithPrompt[label="Inventory"]',
+      'FieldWithPrompt[label="Limit"]',
+      'FieldWithPrompt[label="Source control branch"]',
       'FormGroup[label="Labels"]',
       'VariablesField',
     ];
@@ -137,11 +137,6 @@ describe('<WorkflowJobTemplateForm/>', () => {
       element: 'wfjt-description',
       value: { value: 'new bar', name: 'description' },
     },
-    { element: 'wfjt-limit', value: { value: 1234567890, name: 'limit' } },
-    {
-      element: 'wfjt-scm_branch',
-      value: { value: 'new branch', name: 'scm_branch' },
-    },
   ];
   const changeInputs = async ({ element, value }) => {
     wrapper.find(`input#${element}`).simulate('change', {
@@ -177,6 +172,26 @@ describe('<WorkflowJobTemplateForm/>', () => {
     inputsToChange.map(input => assertChanges(input));
   });
+
+  test('test changes in FieldWithPrompt', async () => {
+    await act(async () => {
+      wrapper.find('TextInputBase#text-wfjt-scm-branch').prop('onChange')(
+        'main'
+      );
+      wrapper.find('TextInputBase#text-wfjt-limit').prop('onChange')(
+        1234567890
+      );
+    });
+
+    wrapper.update();
+
+    expect(wrapper.find('input#text-wfjt-scm-branch').prop('value')).toEqual(
+      'main'
+    );
+    expect(wrapper.find('input#text-wfjt-limit').prop('value')).toEqual(
+      1234567890
+    );
+  });
+
   test('webhooks and enable concurrent jobs functions properly', async () => {
     act(() => {
       wrapper.find('Checkbox[aria-label="Enable Webhook"]').invoke('onChange')(
@@ -31,8 +31,12 @@ options:
     tower_oauthtoken:
       description:
         - The Tower OAuth token to use.
+        - This value can be in one of two formats.
+        - A string which is the token itself. (i.e. bqV5txm97wqJqtkxlMkhQz0pKhRMMX)
+        - A dictionary structure as returned by the tower_token module.
        - If value not set, will try environment variable C(TOWER_OAUTH_TOKEN) and then config files
-      type: str
+      type: raw
+      version_added: "3.7"
     validate_certs:
       description:
         - Whether to allow insecure connections to Tower or AWX.
@@ -3,7 +3,7 @@ __metaclass__ = type
 
 from ansible.module_utils.basic import AnsibleModule, env_fallback
 from ansible.module_utils.urls import Request, SSLValidationError, ConnectionError
-from ansible.module_utils.six import PY2
+from ansible.module_utils.six import PY2, string_types
 from ansible.module_utils.six.moves import StringIO
 from ansible.module_utils.six.moves.urllib.parse import urlparse, urlencode
 from ansible.module_utils.six.moves.urllib.error import HTTPError
@@ -47,7 +47,7 @@ class TowerModule(AnsibleModule):
         tower_username=dict(required=False, fallback=(env_fallback, ['TOWER_USERNAME'])),
         tower_password=dict(no_log=True, required=False, fallback=(env_fallback, ['TOWER_PASSWORD'])),
         validate_certs=dict(type='bool', aliases=['tower_verify_ssl'], required=False, fallback=(env_fallback, ['TOWER_VERIFY_SSL'])),
-        tower_oauthtoken=dict(type='str', no_log=True, required=False, fallback=(env_fallback, ['TOWER_OAUTH_TOKEN'])),
+        tower_oauthtoken=dict(type='raw', no_log=True, required=False, fallback=(env_fallback, ['TOWER_OAUTH_TOKEN'])),
         tower_config_file=dict(type='path', required=False, default=None),
     )
     short_params = {
@@ -96,6 +96,20 @@ class TowerModule(AnsibleModule):
             if direct_value is not None:
                 setattr(self, short_param, direct_value)
 
+        # Perform magic depending on whether tower_oauthtoken is a string or a dict
+        if self.params.get('tower_oauthtoken'):
+            token_param = self.params.get('tower_oauthtoken')
+            if type(token_param) is dict:
+                if 'token' in token_param:
+                    self.oauth_token = self.params.get('tower_oauthtoken')['token']
+                else:
+                    self.fail_json(msg="The provided dict in tower_oauthtoken did not properly contain the token entry")
+            elif isinstance(token_param, string_types):
+                self.oauth_token = self.params.get('tower_oauthtoken')
+            else:
+                error_msg = "The provided tower_oauthtoken type was not valid ({0}). Valid options are str or dict.".format(type(token_param).__name__)
+                self.fail_json(msg=error_msg)
+
         # Perform some basic validation
         if not re.match('^https{0,1}://', self.host):
             self.host = "https://{0}".format(self.host)
@@ -504,6 +518,9 @@ class TowerModule(AnsibleModule):
             item_name = existing_item['username']
         elif 'identifier' in existing_item:
             item_name = existing_item['identifier']
+        elif item_type == 'o_auth2_access_token':
+            # An oauth2 token has no name, instead we will use its id for any of the messages
+            item_name = existing_item['id']
         else:
             self.fail_json(msg="Unable to process delete of {0} due to missing name".format(item_type))
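The string-or-dict branching added to `TowerModule` above can be sketched in isolation. This is a minimal standalone sketch, not collection code: the helper name `extract_oauth_token` is illustrative, and `ValueError` stands in for the module's `fail_json` call.

```python
def extract_oauth_token(token_param):
    """Accept either a raw token string or a tower_token-style dict.

    Returns the token string; raises ValueError (standing in for
    fail_json) when the input is neither supported form.
    """
    if isinstance(token_param, dict):
        # Dict form must carry the token under the 'token' key,
        # matching what the tower_token module returns.
        if 'token' not in token_param:
            raise ValueError("dict in tower_oauthtoken did not contain the token entry")
        return token_param['token']
    if isinstance(token_param, str):
        # Plain string form: the value is the token itself.
        return token_param
    raise ValueError(
        "invalid tower_oauthtoken type ({0}); valid options are str or dict".format(
            type(token_param).__name__
        )
    )
```

This mirrors why the argument spec changes from `type='str'` to `type='raw'`: coercing everything to `str` would destroy the dict form before the branching above could see it.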
@@ -44,6 +44,14 @@ options:
     description:
       - Name of the inventory to use for the job template.
     type: str
+  organization:
+    description:
+      - Organization the job template exists in.
+      - Used to help lookup the object, cannot be modified using this module.
+      - The Organization is inferred from the associated project
+      - If not provided, will lookup by name only, which does not work with duplicates.
+      - Requires Tower Version 3.7.0 or AWX 10.0.0 IS NOT backwards compatible with earlier versions.
+    type: str
   project:
     description:
       - Name of the project to use for the job template.
@@ -282,6 +290,7 @@ EXAMPLES = '''
   tower_job_template:
     name: "Ping"
     job_type: "run"
+    organization: "Default"
     inventory: "Local"
     project: "Demo"
     playbook: "ping.yml"
@@ -332,6 +341,7 @@ def main():
         name=dict(required=True),
         new_name=dict(),
         description=dict(default=''),
+        organization=dict(),
         job_type=dict(choices=['run', 'check']),
         inventory=dict(),
         project=dict(),
@@ -398,19 +408,24 @@ def main():
             credentials = []
         credentials.append(credential)
 
+    new_fields = {}
+    search_fields = {'name': name}
+
+    # Attempt to look up the related items the user specified (these will fail the module if not found)
+    organization_id = None
+    organization = module.params.get('organization')
+    if organization:
+        organization_id = module.resolve_name_to_id('organizations', organization)
+        search_fields['organization'] = new_fields['organization'] = organization_id
+
     # Attempt to look up an existing item based on the provided data
-    existing_item = module.get_one('job_templates', **{
-        'data': {
-            'name': name,
-        }
-    })
+    existing_item = module.get_one('job_templates', **{'data': search_fields})
 
     if state == 'absent':
         # If the state was absent we can let the module delete it if needed, the module will handle exiting from this
         module.delete_if_needed(existing_item)
 
     # Create the data that gets sent for create and update
-    new_fields = {}
     new_fields['name'] = new_name if new_name else name
     for field_name in (
         'description', 'job_type', 'playbook', 'scm_branch', 'forks', 'limit', 'verbosity',
@@ -437,7 +452,20 @@ def main():
     if inventory is not None:
         new_fields['inventory'] = module.resolve_name_to_id('inventories', inventory)
     if project is not None:
-        new_fields['project'] = module.resolve_name_to_id('projects', project)
+        if organization_id is not None:
+            project_data = module.get_one('projects', **{
+                'data': {
+                    'name': project,
+                    'organization': organization_id,
+                }
+            })
+            if project_data is None:
+                module.fail_json(msg="The project {0} in organization {1} was not found on the Tower server".format(
+                    project, organization
+                ))
+            new_fields['project'] = project_data['id']
+        else:
+            new_fields['project'] = module.resolve_name_to_id('projects', project)
     if webhook_credential is not None:
         new_fields['webhook_credential'] = module.resolve_name_to_id('credentials', webhook_credential)
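The point of the `search_fields` change above is that a lookup by name alone cannot distinguish identically named job templates in different organizations. A minimal sketch of that query-building step, assuming a hypothetical helper name (`build_lookup_query` is not part of the collection):

```python
def build_lookup_query(name, organization_id=None):
    """Build the GET query dict used to look up an existing job template.

    When an organization id has been resolved, it is added to the search
    so that duplicate names across organizations no longer collide.
    """
    query = {'name': name}
    if organization_id is not None:
        # Narrows the match to one organization; without this the
        # name-only lookup fails in the presence of duplicates.
        query['organization'] = organization_id
    return query
```

With an organization the query uniquely identifies the template; without one, behavior falls back to the old name-only lookup.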
@@ -80,6 +80,10 @@ except ImportError:
 
 
 def coerce_type(module, value):
+    # If our value is already None we can just return directly
+    if value is None:
+        return value
+
     yaml_ish = bool((
         value.startswith('{') and value.endswith('}')
     ) or (
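The guard added to `coerce_type` matters because the very next statement calls string methods on `value`. A minimal sketch of just that check, under the assumption (visible in the hunk) that the function goes on to test for brace-delimited, YAML/JSON-ish strings; `looks_yaml_ish` is an illustrative name, not the real function:

```python
def looks_yaml_ish(value):
    """Mimic the None guard added to coerce_type: without it,
    value.startswith(...) would raise AttributeError on None."""
    if value is None:
        return None
    # Simplified version of the brace test shown in the hunk.
    return bool(value.startswith('{') and value.endswith('}'))
```

Passing `None` straight through preserves "no value given" semantics instead of crashing.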
201
awx_collection/plugins/modules/tower_token.py
Normal file
@@ -0,0 +1,201 @@
|
|||||||
|
#!/usr/bin/python
|
||||||
|
# coding: utf-8 -*-
|
||||||
|
|
||||||
|
|
||||||
|
# (c) 2020, John Westcott IV <john.westcott.iv@redhat.com>
|
||||||
|
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
|
||||||
|
|
||||||
|
from __future__ import absolute_import, division, print_function
|
||||||
|
__metaclass__ = type
|
||||||
|
|
||||||
|
|
||||||
|
ANSIBLE_METADATA = {'metadata_version': '1.1',
|
||||||
|
'status': ['preview'],
|
||||||
|
'supported_by': 'community'}
|
||||||
|
|
||||||
|
DOCUMENTATION = '''
|
||||||
|
---
|
||||||
|
module: tower_token
|
||||||
|
author: "John Westcott IV (@john-westcott-iv)"
|
||||||
|
version_added: "2.3"
short_description: create, update, or destroy Ansible Tower tokens.
description:
    - Create or destroy Ansible Tower tokens. See
      U(https://www.ansible.com/tower) for an overview.
    - In addition, the module sets an Ansible fact which can be passed into other
      tower_* modules as the parameter tower_oauthtoken. See examples for usage.
    - Because of the sensitive nature of tokens, the created token value is only available once
      through the Ansible fact. (See RETURN for details)
    - Due to the nature of tokens in Tower this module is not idempotent. A second run
      with the same parameters will create a new token.
    - If you are creating a temporary token for use with modules you should delete the token
      when you are done with it. See the example for how to do it.
options:
    description:
      description:
        - Optional description of this access token.
      required: False
      type: str
      default: ''
    application:
      description:
        - The application tied to this token.
      required: False
      type: str
    scope:
      description:
        - Allowed scopes, further restricts user's permissions. Must be a simple space-separated string with allowed scopes ['read', 'write'].
      required: False
      type: str
      default: 'write'
      choices: ["read", "write"]
    existing_token:
      description: The data structure produced by tower_token in create mode, to be used with state absent.
      type: dict
    existing_token_id:
      description: A token ID (number) which can be used to delete an arbitrary token with state absent.
      type: str
    state:
      description:
        - Desired state of the resource.
      choices: ["present", "absent"]
      default: "present"
      type: str
extends_documentation_fragment: awx.awx.auth
'''

EXAMPLES = '''
- block:
    - name: Create a new token using an existing token
      tower_token:
        description: '{{ token_description }}'
        scope: "write"
        state: present
        tower_oauthtoken: "{{ my_existing_token }}"

    - name: Delete this token
      tower_token:
        existing_token: "{{ tower_token }}"
        state: absent

    - name: Create a new token using username/password
      tower_token:
        description: '{{ token_description }}'
        scope: "write"
        state: present
        tower_username: "{{ my_username }}"
        tower_password: "{{ my_password }}"

    - name: Use our new token to make another call
      tower_job_list:
        tower_oauthtoken: "{{ tower_token }}"

  always:
    - name: Delete our Token with the token we created
      tower_token:
        existing_token: "{{ tower_token }}"
        state: absent
      when: tower_token is defined

- name: Delete a token by its id
  tower_token:
    existing_token_id: 4
    state: absent
'''

RETURN = '''
tower_token:
  type: dict
  description: An Ansible Fact variable representing a Tower token object which can be used for auth in subsequent modules. See examples for usage.
  contains:
    token:
      description: The token that was generated. This token can never be accessed again; make sure this value is noted before it is lost.
      type: str
    id:
      description: The numeric ID of the token created.
      type: str
  returned: on successful create
'''

from ..module_utils.tower_api import TowerModule


def return_token(module, last_response):
    # A token is special because you can never get the actual token value back from the API.
    # The default module return would give you an ID, but the token itself would stay masked forever.
    # This method returns the entire token object we got back so that the user has access to the token.

    module.json_output['ansible_facts'] = {
        'tower_token': last_response,
    }
    module.exit_json(**module.json_output)


def main():
    # Any additional arguments that are not fields of the item can be added here
    argument_spec = dict(
        description=dict(),
        application=dict(),
        scope=dict(choices=['read', 'write'], default='write'),
        existing_token=dict(type='dict'),
        existing_token_id=dict(),
        state=dict(choices=['present', 'absent'], default='present'),
    )

    # Create a module for ourselves
    module = TowerModule(
        argument_spec=argument_spec,
        mutually_exclusive=[
            ('existing_token', 'existing_token_id'),
        ],
        # If state is absent, make sure one of existing_token or existing_token_id is present
        required_if=[
            ['state', 'absent', ('existing_token', 'existing_token_id'), True],
        ],
    )

    # Extract our parameters
    description = module.params.get('description')
    application = module.params.get('application')
    scope = module.params.get('scope')
    existing_token = module.params.get('existing_token')
    existing_token_id = module.params.get('existing_token_id')
    state = module.params.get('state')

    if state == 'absent':
        if not existing_token:
            existing_token = module.get_one('tokens', **{
                'data': {
                    'id': existing_token_id,
                }
            })

        # If the state was absent we can let the module delete it if needed; the module will handle exiting from this
        module.delete_if_needed(existing_token)

    # Attempt to look up the related items the user specified (these will fail the module if not found)
    application_id = None
    if application:
        application_id = module.resolve_name_to_id('applications', application)

    # Create the data that gets sent for create and update
    new_fields = {}
    if description is not None:
        new_fields['description'] = description
    if application is not None:
        new_fields['application'] = application_id
    if scope is not None:
        new_fields['scope'] = scope

    # If the state was present, let the module build or update the existing item; this will return on its own
    module.create_or_update_if_needed(
        None, new_fields,
        endpoint='tokens', item_type='token',
        associations={
        },
        on_create=return_token,
    )


if __name__ == '__main__':
    main()
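The four-element `required_if` rule in `main()` above is worth unpacking: the trailing `True` switches the check from "all of these parameters are required" to "any one of these parameters is required". A minimal standalone sketch of that semantics (a hypothetical helper for illustration, not the collection's actual validation code):

```python
def check_required_if(requirements, params):
    """Sketch of Ansible's four-element required_if semantics."""
    errors = []
    for key, value, needed, any_of in requirements:
        if params.get(key) != value:
            continue  # the rule only fires when the trigger parameter matches
        present = [n for n in needed if params.get(n) is not None]
        if any_of and not present:
            # any-of mode: at least one of the listed parameters must be set
            errors.append('%s is %s but any of the following are missing: %s'
                          % (key, value, ', '.join(needed)))
        elif not any_of and len(present) != len(needed):
            # all-of mode: every listed parameter must be set
            errors.append('%s is %s but all of the following are missing: %s'
                          % (key, value, ', '.join(n for n in needed if n not in present)))
    return errors


rules = [['state', 'absent', ('existing_token', 'existing_token_id'), True]]
assert check_required_if(rules, {'state': 'present'}) == []
assert check_required_if(rules, {'state': 'absent', 'existing_token_id': '4'}) == []
assert len(check_required_if(rules, {'state': 'absent'})) == 1
```

Combined with the `mutually_exclusive` rule, this enforces exactly one of `existing_token` or `existing_token_id` when deleting a token.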
awx_collection/test/awx/test_token.py (new file)
@@ -0,0 +1,29 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

import pytest

from awx.main.models import OAuth2AccessToken


@pytest.mark.django_db
def test_create_token(run_module, admin_user):

    module_args = {
        'description': 'barfoo',
        'state': 'present',
        'scope': 'read',
        'tower_host': None,
        'tower_username': None,
        'tower_password': None,
        'validate_certs': None,
        'tower_oauthtoken': None,
        'tower_config_file': None,
    }

    result = run_module('tower_token', module_args, admin_user)
    assert result.get('changed'), result

    tokens = OAuth2AccessToken.objects.filter(description='barfoo')
    assert len(tokens) == 1, 'Expected one token with description barfoo, found {0}'.format(len(tokens))
    assert tokens[0].scope == 'read', 'Token was not given read access'
@@ -74,3 +74,14 @@
 - assert:
     that:
       - "result is changed"
+
+- name: Handle an omit value
+  tower_settings:
+    name: AWX_PROOT_BASE_PATH
+    value: '{{ junk_var | default(omit) }}'
+  register: result
+  ignore_errors: true
+
+- assert:
+    that:
+      - "'Unable to update settings' in result.msg"
@@ -0,0 +1,110 @@
---
- name: Generate names
  set_fact:
    token_description: "AWX-Collection-tests-tower_token-description-{{ lookup('password', '/dev/null chars=ascii_letters length=16') }}"

- name: Try to use a token as a dict which is missing the token parameter
  tower_job_list:
    tower_oauthtoken:
      not_token: "This has no token entry"
  register: results
  ignore_errors: true

- assert:
    that:
      - results is failed
      - '"The provided dict in tower_oauthtoken did not properly contain the token entry" == results.msg'

- name: Try to use a token as a list
  tower_job_list:
    tower_oauthtoken:
      - dummy_token
  register: results
  ignore_errors: true

- assert:
    that:
      - results is failed
      - '"The provided tower_oauthtoken type was not valid (list). Valid options are str or dict." == results.msg'

- name: Try to delete a token with no existing_token or existing_token_id
  tower_token:
    state: absent
  register: results
  ignore_errors: true

- assert:
    that:
      - results is failed
  # We don't assert a message here because it is handled by Ansible

- name: Try to delete a token with both existing_token and existing_token_id
  tower_token:
    existing_token:
      id: 1234
    existing_token_id: 1234
    state: absent
  register: results
  ignore_errors: true

- assert:
    that:
      - results is failed
  # We don't assert a message here because it is handled by Ansible

- block:
    - name: Create a Token
      tower_token:
        description: '{{ token_description }}'
        scope: "write"
        state: present
      register: new_token

    - name: Validate our token works by token
      tower_job_list:
        tower_oauthtoken: "{{ tower_token.token }}"
      register: job_list

    - name: Validate our token works by object
      tower_job_list:
        tower_oauthtoken: "{{ tower_token }}"
      register: job_list

  always:
    - name: Delete our Token with our own token
      tower_token:
        existing_token: "{{ tower_token }}"
        tower_oauthtoken: "{{ tower_token }}"
        state: absent
      when: tower_token is defined
      register: results

- assert:
    that:
      - results is changed or results is skipped

- block:
    - name: Create a second token
      tower_token:
        description: '{{ token_description }}'
        scope: "write"
        state: present
      register: results

    - assert:
        that:
          - results is changed

  always:
    - name: Delete the second Token with our own token
      tower_token:
        existing_token_id: "{{ tower_token['id'] }}"
        tower_oauthtoken: "{{ tower_token }}"
        state: absent
      when: tower_token is defined
      register: results

- assert:
    that:
      - results is changed or results is skipped
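The negative tests above pin down how a `tower_oauthtoken` value is validated: a plain string is used as-is, a dict must carry a `token` entry, and anything else is rejected. A minimal sketch of that check (a hypothetical function for illustration; only the error strings come from the assertions above):

```python
def extract_token(tower_oauthtoken):
    """Sketch of the str-or-dict validation exercised by the tasks above."""
    if isinstance(tower_oauthtoken, str):
        return tower_oauthtoken  # a bare string is the token itself
    if isinstance(tower_oauthtoken, dict):
        try:
            return tower_oauthtoken['token']  # e.g. the fact set by tower_token
        except KeyError:
            raise ValueError('The provided dict in tower_oauthtoken did not '
                             'properly contain the token entry')
    raise ValueError('The provided tower_oauthtoken type was not valid (%s). '
                     'Valid options are str or dict.'
                     % type(tower_oauthtoken).__name__)


assert extract_token('abc123') == 'abc123'
assert extract_token({'token': 'abc123', 'id': 4}) == 'abc123'
```

This is why the playbooks can pass either `tower_token.token` or the whole `tower_token` fact.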
@@ -1 +1 @@
-11.2.0
+12.0.0
@@ -218,7 +218,6 @@ Each Tower instance is made up of several different services working collaborati
 * **Callback Receiver** - Receives job events that result from running Ansible jobs.
 * **Celery** - The worker queue that processes and runs all jobs.
 * **Redis** - this is used as a queue for AWX to process ansible playbook callback events.
-* **Memcached** - A local caching service for the instance it lives on.

 Tower is configured in such a way that if any of these services or their components fail, then all services are restarted. If these fail sufficiently (often in a short span of time), then the entire instance will be placed offline in an automated fashion in order to allow remediation without causing unexpected behavior.
docs/licenses/django-redis.txt (new file)
@@ -0,0 +1,26 @@
Copyright (c) 2011-2016 Andrey Antukh <niwi@niwi.nz>
Copyright (c) 2011 Sean Bleier

All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
1. Redistributions of source code must retain the above copyright
   notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
   notice, this list of conditions and the following disclaimer in the
   documentation and/or other materials provided with the distribution.
3. The name of the author may not be used to endorse or promote products
   derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR
IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES
OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
@@ -1,556 +0,0 @@
PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
--------------------------------------------

1. This LICENSE AGREEMENT is between the Python Software Foundation
("PSF"), and the Individual or Organization ("Licensee") accessing and
otherwise using this software ("Python") in source or binary form and
its associated documentation.

2. Subject to the terms and conditions of this License Agreement, PSF
hereby grants Licensee a nonexclusive, royalty-free, world-wide
license to reproduce, analyze, test, perform and/or display publicly,
prepare derivative works, distribute, and otherwise use Python
alone or in any derivative version, provided, however, that PSF's
License Agreement and PSF's notice of copyright, i.e., "Copyright (c)
2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008 Python Software Foundation;
All Rights Reserved" are retained in Python alone or in any derivative
version prepared by Licensee.

3. In the event Licensee prepares a derivative work that is based on
or incorporates Python or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python.

4. PSF is making Python available to Licensee on an "AS IS"
basis.  PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED.  BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.

5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.

7. Nothing in this License Agreement shall be deemed to create any
relationship of agency, partnership, or joint venture between PSF and
Licensee.  This License Agreement does not grant permission to use PSF
trademarks or trade name in a trademark sense to endorse or promote
products or services of Licensee, or any third party.

8. By copying, installing or otherwise using Python, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.


BEOPEN.COM LICENSE AGREEMENT FOR PYTHON 2.0
-------------------------------------------

BEOPEN PYTHON OPEN SOURCE LICENSE AGREEMENT VERSION 1

1. This LICENSE AGREEMENT is between BeOpen.com ("BeOpen"), having an
office at 160 Saratoga Avenue, Santa Clara, CA 95051, and the
Individual or Organization ("Licensee") accessing and otherwise using
this software in source or binary form and its associated
documentation ("the Software").

2. Subject to the terms and conditions of this BeOpen Python License
Agreement, BeOpen hereby grants Licensee a non-exclusive,
royalty-free, world-wide license to reproduce, analyze, test, perform
and/or display publicly, prepare derivative works, distribute, and
otherwise use the Software alone or in any derivative version,
provided, however, that the BeOpen Python License is retained in the
Software, alone or in any derivative version prepared by Licensee.

3. BeOpen is making the Software available to Licensee on an "AS IS"
basis.  BEOPEN MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED.  BY WAY OF EXAMPLE, BUT NOT LIMITATION, BEOPEN MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.

4. BEOPEN SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF THE
SOFTWARE FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS
AS A RESULT OF USING, MODIFYING OR DISTRIBUTING THE SOFTWARE, OR ANY
DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

5. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.

6. This License Agreement shall be governed by and interpreted in all
respects by the law of the State of California, excluding conflict of
law provisions.  Nothing in this License Agreement shall be deemed to
create any relationship of agency, partnership, or joint venture
between BeOpen and Licensee.  This License Agreement does not grant
permission to use BeOpen trademarks or trade names in a trademark
sense to endorse or promote products or services of Licensee, or any
third party.  As an exception, the "BeOpen Python" logos available at
http://www.pythonlabs.com/logos.html may be used according to the
permissions granted on that web page.

7. By copying, installing or otherwise using the software, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.


CNRI LICENSE AGREEMENT FOR PYTHON 1.6.1
---------------------------------------

1. This LICENSE AGREEMENT is between the Corporation for National
Research Initiatives, having an office at 1895 Preston White Drive,
Reston, VA 20191 ("CNRI"), and the Individual or Organization
("Licensee") accessing and otherwise using Python 1.6.1 software in
source or binary form and its associated documentation.

2. Subject to the terms and conditions of this License Agreement, CNRI
hereby grants Licensee a nonexclusive, royalty-free, world-wide
license to reproduce, analyze, test, perform and/or display publicly,
prepare derivative works, distribute, and otherwise use Python 1.6.1
alone or in any derivative version, provided, however, that CNRI's
License Agreement and CNRI's notice of copyright, i.e., "Copyright (c)
1995-2001 Corporation for National Research Initiatives; All Rights
Reserved" are retained in Python 1.6.1 alone or in any derivative
version prepared by Licensee.  Alternately, in lieu of CNRI's License
Agreement, Licensee may substitute the following text (omitting the
quotes): "Python 1.6.1 is made available subject to the terms and
conditions in CNRI's License Agreement.  This Agreement together with
Python 1.6.1 may be located on the Internet using the following
unique, persistent identifier (known as a handle): 1895.22/1013.  This
Agreement may also be obtained from a proxy server on the Internet
using the following URL: http://hdl.handle.net/1895.22/1013".

3. In the event Licensee prepares a derivative work that is based on
or incorporates Python 1.6.1 or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python 1.6.1.

4. CNRI is making Python 1.6.1 available to Licensee on an "AS IS"
basis.  CNRI MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED.  BY WAY OF EXAMPLE, BUT NOT LIMITATION, CNRI MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON 1.6.1 WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.

5. CNRI SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
1.6.1 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 1.6.1,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.

7. This License Agreement shall be governed by the federal
intellectual property law of the United States, including without
limitation the federal copyright law, and, to the extent such
U.S. federal law does not apply, by the law of the Commonwealth of
Virginia, excluding Virginia's conflict of law provisions.
Notwithstanding the foregoing, with regard to derivative works based
on Python 1.6.1 that incorporate non-separable material that was
previously distributed under the GNU General Public License (GPL), the
law of the Commonwealth of Virginia shall govern this License
Agreement only as to issues arising under or with respect to
Paragraphs 4, 5, and 7 of this License Agreement.  Nothing in this
License Agreement shall be deemed to create any relationship of
agency, partnership, or joint venture between CNRI and Licensee.  This
License Agreement does not grant permission to use CNRI trademarks or
trade name in a trademark sense to endorse or promote products or
services of Licensee, or any third party.

8. By clicking on the "ACCEPT" button where indicated, or by copying,
installing or otherwise using Python 1.6.1, Licensee agrees to be
bound by the terms and conditions of this License Agreement.

        ACCEPT


CWI LICENSE AGREEMENT FOR PYTHON 0.9.0 THROUGH 1.2
--------------------------------------------------

Copyright (c) 1991 - 1995, Stichting Mathematisch Centrum Amsterdam,
The Netherlands.  All rights reserved.

Permission to use, copy, modify, and distribute this software and its
documentation for any purpose and without fee is hereby granted,
provided that the above copyright notice appear in all copies and that
both that copyright notice and this permission notice appear in
supporting documentation, and that the name of Stichting Mathematisch
Centrum or CWI not be used in advertising or publicity pertaining to
distribution of the software without specific, written prior
permission.

STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO
THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE
FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

This copy of Python includes a copy of bzip2, which is licensed under the following terms:

This program, "bzip2", the associated library "libbzip2", and all
documentation, are copyright (C) 1996-2005 Julian R Seward.  All
rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:

1. Redistributions of source code must retain the above copyright
   notice, this list of conditions and the following disclaimer.

2. The origin of this software must not be misrepresented; you must
   not claim that you wrote the original software.  If you use this
   software in a product, an acknowledgment in the product
   documentation would be appreciated but is not required.

3. Altered source versions must be plainly marked as such, and must
   not be misrepresented as being the original software.

4. The name of the author may not be used to endorse or promote
   products derived from this software without specific prior written
   permission.

THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS
OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

Julian Seward, Cambridge, UK.
jseward@acm.org
bzip2/libbzip2 version 1.0.3 of 15 February 2005

This copy of Python includes a copy of db, which is licensed under the following terms:

/*-
 * $Id: LICENSE,v 12.1 2005/06/16 20:20:10 bostic Exp $
 */

The following is the license that applies to this copy of the Berkeley DB
software.  For a license to use the Berkeley DB software under conditions
other than those described here, or to purchase support for this software,
please contact Sleepycat Software by email at info@sleepycat.com, or on
the Web at http://www.sleepycat.com.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
/*
 * Copyright (c) 1990-2005
 *      Sleepycat Software.  All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 * 1. Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 * 2. Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in the
 *    documentation and/or other materials provided with the distribution.
 * 3. Redistributions in any form must be accompanied by information on
 *    how to obtain complete source code for the DB software and any
 *    accompanying software that uses the DB software.  The source code
 *    must either be included in the distribution or be available for no
 *    more than the cost of distribution plus a nominal fee, and must be
 *    freely redistributable under reasonable conditions.  For an
 *    executable file, complete source code means the source code for all
 *    modules it contains.  It does not include source code for modules or
 *    files that typically accompany the major components of the operating
 *    system on which the executable file runs.
 *
 * THIS SOFTWARE IS PROVIDED BY SLEEPYCAT SOFTWARE ``AS IS'' AND ANY EXPRESS
 * OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
 * WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR
 * NON-INFRINGEMENT, ARE DISCLAIMED.  IN NO EVENT SHALL SLEEPYCAT SOFTWARE
 * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
 * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
 * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
 * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
 * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
 * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
 * THE POSSIBILITY OF SUCH DAMAGE.
 */
/*
 * Copyright (c) 1990, 1993, 1994, 1995
 *      The Regents of the University of California.  All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 * 1. Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 * 2. Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in the
 *    documentation and/or other materials provided with the distribution.
 * 3. Neither the name of the University nor the names of its contributors
 *    may be used to endorse or promote products derived from this software
 *    without specific prior written permission.
 *
 * THIS SOFTWARE IS PROVIDED BY THE REGENTS AND CONTRIBUTORS ``AS IS'' AND
 * ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
 * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
 * ARE DISCLAIMED.  IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE
 * FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
 * DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
 * OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
 * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
 * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
 * OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
 * SUCH DAMAGE.
 */
/*
 * Copyright (c) 1995, 1996
 *      The President and Fellows of Harvard University.  All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 * 1. Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 * 2. Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in the
 *    documentation and/or other materials provided with the distribution.
 * 3. Neither the name of the University nor the names of its contributors
 *    may be used to endorse or promote products derived from this software
 *    without specific prior written permission.
|
|
||||||
*
|
|
||||||
* THIS SOFTWARE IS PROVIDED BY HARVARD AND ITS CONTRIBUTORS ``AS IS'' AND
|
|
||||||
* ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
|
|
||||||
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
|
|
||||||
* ARE DISCLAIMED. IN NO EVENT SHALL HARVARD OR ITS CONTRIBUTORS BE LIABLE
|
|
||||||
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
|
|
||||||
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
|
|
||||||
* OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
|
|
||||||
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
|
|
||||||
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
|
|
||||||
* OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
|
|
||||||
* SUCH DAMAGE.
|
|
||||||
*/
|
|
||||||
|
|
||||||
This copy of Python includes a copy of openssl, which is licensed under the following terms:
|
|
||||||
|
|
||||||
|
|
||||||
LICENSE ISSUES
|
|
||||||
==============
|
|
||||||
|
|
||||||
The OpenSSL toolkit stays under a dual license, i.e. both the conditions of
|
|
||||||
the OpenSSL License and the original SSLeay license apply to the toolkit.
|
|
||||||
See below for the actual license texts. Actually both licenses are BSD-style
|
|
||||||
Open Source licenses. In case of any license issues related to OpenSSL
|
|
||||||
please contact openssl-core@openssl.org.
|
|
||||||
|
|
||||||
OpenSSL License
|
|
||||||
---------------
|
|
||||||
|
|
||||||
/* ====================================================================
|
|
||||||
* Copyright (c) 1998-2005 The OpenSSL Project. All rights reserved.
|
|
||||||
*
|
|
||||||
* Redistribution and use in source and binary forms, with or without
|
|
||||||
* modification, are permitted provided that the following conditions
|
|
||||||
* are met:
|
|
||||||
*
|
|
||||||
* 1. Redistributions of source code must retain the above copyright
|
|
||||||
* notice, this list of conditions and the following disclaimer.
|
|
||||||
*
|
|
||||||
* 2. Redistributions in binary form must reproduce the above copyright
|
|
||||||
* notice, this list of conditions and the following disclaimer in
|
|
||||||
* the documentation and/or other materials provided with the
|
|
||||||
* distribution.
|
|
||||||
*
|
|
||||||
* 3. All advertising materials mentioning features or use of this
|
|
||||||
* software must display the following acknowledgment:
|
|
||||||
* "This product includes software developed by the OpenSSL Project
|
|
||||||
* for use in the OpenSSL Toolkit. (http://www.openssl.org/)"
|
|
||||||
*
|
|
||||||
* 4. The names "OpenSSL Toolkit" and "OpenSSL Project" must not be used to
|
|
||||||
* endorse or promote products derived from this software without
|
|
||||||
* prior written permission. For written permission, please contact
|
|
||||||
* openssl-core@openssl.org.
|
|
||||||
*
|
|
||||||
* 5. Products derived from this software may not be called "OpenSSL"
|
|
||||||
* nor may "OpenSSL" appear in their names without prior written
|
|
||||||
* permission of the OpenSSL Project.
|
|
||||||
*
|
|
||||||
* 6. Redistributions of any form whatsoever must retain the following
|
|
||||||
* acknowledgment:
|
|
||||||
* "This product includes software developed by the OpenSSL Project
|
|
||||||
* for use in the OpenSSL Toolkit (http://www.openssl.org/)"
|
|
||||||
*
|
|
||||||
* THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY
|
|
||||||
* EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
|
|
||||||
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
|
||||||
* PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE OpenSSL PROJECT OR
|
|
||||||
* ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
|
|
||||||
* SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
|
|
||||||
* NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
|
|
||||||
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
|
|
||||||
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
|
|
||||||
* STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
|
|
||||||
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
|
|
||||||
* OF THE POSSIBILITY OF SUCH DAMAGE.
|
|
||||||
* ====================================================================
|
|
||||||
*
|
|
||||||
* This product includes cryptographic software written by Eric Young
|
|
||||||
* (eay@cryptsoft.com). This product includes software written by Tim
|
|
||||||
* Hudson (tjh@cryptsoft.com).
|
|
||||||
*
|
|
||||||
*/
|
|
||||||
|
|
||||||
Original SSLeay License
|
|
||||||
-----------------------
|
|
||||||
|
|
||||||
/* Copyright (C) 1995-1998 Eric Young (eay@cryptsoft.com)
|
|
||||||
* All rights reserved.
|
|
||||||
*
|
|
||||||
* This package is an SSL implementation written
|
|
||||||
* by Eric Young (eay@cryptsoft.com).
|
|
||||||
* The implementation was written so as to conform with Netscapes SSL.
|
|
||||||
*
|
|
||||||
* This library is free for commercial and non-commercial use as long as
|
|
||||||
* the following conditions are aheared to. The following conditions
|
|
||||||
* apply to all code found in this distribution, be it the RC4, RSA,
|
|
||||||
* lhash, DES, etc., code; not just the SSL code. The SSL documentation
|
|
||||||
* included with this distribution is covered by the same copyright terms
|
|
||||||
* except that the holder is Tim Hudson (tjh@cryptsoft.com).
|
|
||||||
*
|
|
||||||
* Copyright remains Eric Young's, and as such any Copyright notices in
|
|
||||||
* the code are not to be removed.
|
|
||||||
* If this package is used in a product, Eric Young should be given attribution
|
|
||||||
* as the author of the parts of the library used.
|
|
||||||
* This can be in the form of a textual message at program startup or
|
|
||||||
* in documentation (online or textual) provided with the package.
|
|
||||||
*
|
|
||||||
* Redistribution and use in source and binary forms, with or without
|
|
||||||
* modification, are permitted provided that the following conditions
|
|
||||||
* are met:
|
|
||||||
* 1. Redistributions of source code must retain the copyright
|
|
||||||
* notice, this list of conditions and the following disclaimer.
|
|
||||||
* 2. Redistributions in binary form must reproduce the above copyright
|
|
||||||
* notice, this list of conditions and the following disclaimer in the
|
|
||||||
* documentation and/or other materials provided with the distribution.
|
|
||||||
* 3. All advertising materials mentioning features or use of this software
|
|
||||||
* must display the following acknowledgement:
|
|
||||||
* "This product includes cryptographic software written by
|
|
||||||
* Eric Young (eay@cryptsoft.com)"
|
|
||||||
* The word 'cryptographic' can be left out if the rouines from the library
|
|
||||||
* being used are not cryptographic related :-).
|
|
||||||
* 4. If you include any Windows specific code (or a derivative thereof) from
|
|
||||||
* the apps directory (application code) you must include an acknowledgement:
|
|
||||||
* "This product includes software written by Tim Hudson (tjh@cryptsoft.com)"
|
|
||||||
*
|
|
||||||
* THIS SOFTWARE IS PROVIDED BY ERIC YOUNG ``AS IS'' AND
|
|
||||||
* ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
|
|
||||||
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
|
|
||||||
* ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE
|
|
||||||
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
|
|
||||||
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
|
|
||||||
* OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
|
|
||||||
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
|
|
||||||
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
|
|
||||||
* OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
|
|
||||||
* SUCH DAMAGE.
|
|
||||||
*
|
|
||||||
* The licence and distribution terms for any publically available version or
|
|
||||||
* derivative of this code cannot be changed. i.e. this code cannot simply be
|
|
||||||
* copied and put under another distribution licence
|
|
||||||
* [including the GNU Public Licence.]
|
|
||||||
*/
|
|
||||||
|
|
||||||
|
|
||||||
This copy of Python includes a copy of tcl, which is licensed under the following terms:
|
|
||||||
|
|
||||||
This software is copyrighted by the Regents of the University of
|
|
||||||
California, Sun Microsystems, Inc., Scriptics Corporation, ActiveState
|
|
||||||
Corporation and other parties. The following terms apply to all files
|
|
||||||
associated with the software unless explicitly disclaimed in
|
|
||||||
individual files.
|
|
||||||
|
|
||||||
The authors hereby grant permission to use, copy, modify, distribute,
|
|
||||||
and license this software and its documentation for any purpose, provided
|
|
||||||
that existing copyright notices are retained in all copies and that this
|
|
||||||
notice is included verbatim in any distributions. No written agreement,
|
|
||||||
license, or royalty fee is required for any of the authorized uses.
|
|
||||||
Modifications to this software may be copyrighted by their authors
|
|
||||||
and need not follow the licensing terms described here, provided that
|
|
||||||
the new terms are clearly indicated on the first page of each file where
|
|
||||||
they apply.
|
|
||||||
|
|
||||||
IN NO EVENT SHALL THE AUTHORS OR DISTRIBUTORS BE LIABLE TO ANY PARTY
|
|
||||||
FOR DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES
|
|
||||||
ARISING OUT OF THE USE OF THIS SOFTWARE, ITS DOCUMENTATION, OR ANY
|
|
||||||
DERIVATIVES THEREOF, EVEN IF THE AUTHORS HAVE BEEN ADVISED OF THE
|
|
||||||
POSSIBILITY OF SUCH DAMAGE.
|
|
||||||
|
|
||||||
THE AUTHORS AND DISTRIBUTORS SPECIFICALLY DISCLAIM ANY WARRANTIES,
|
|
||||||
INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY,
|
|
||||||
FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT. THIS SOFTWARE
|
|
||||||
IS PROVIDED ON AN "AS IS" BASIS, AND THE AUTHORS AND DISTRIBUTORS HAVE
|
|
||||||
NO OBLIGATION TO PROVIDE MAINTENANCE, SUPPORT, UPDATES, ENHANCEMENTS, OR
|
|
||||||
MODIFICATIONS.
|
|
||||||
|
|
||||||
GOVERNMENT USE: If you are acquiring this software on behalf of the
|
|
||||||
U.S. government, the Government shall have only "Restricted Rights"
|
|
||||||
in the software and related documentation as defined in the Federal
|
|
||||||
Acquisition Regulations (FARs) in Clause 52.227.19 (c) (2). If you
|
|
||||||
are acquiring the software on behalf of the Department of Defense, the
|
|
||||||
software shall be classified as "Commercial Computer Software" and the
|
|
||||||
Government shall have only "Restricted Rights" as defined in Clause
|
|
||||||
252.227-7013 (c) (1) of DFARs. Notwithstanding the foregoing, the
|
|
||||||
authors grant the U.S. Government and others acting in its behalf
|
|
||||||
permission to use and distribute the software in accordance with the
|
|
||||||
terms specified in this license.
|
|
||||||
|
|
||||||
This copy of Python includes a copy of tk, which is licensed under the following terms:
|
|
||||||
|
|
||||||
This software is copyrighted by the Regents of the University of
|
|
||||||
California, Sun Microsystems, Inc., and other parties. The following
|
|
||||||
terms apply to all files associated with the software unless explicitly
|
|
||||||
disclaimed in individual files.
|
|
||||||
|
|
||||||
The authors hereby grant permission to use, copy, modify, distribute,
|
|
||||||
and license this software and its documentation for any purpose, provided
|
|
||||||
that existing copyright notices are retained in all copies and that this
|
|
||||||
notice is included verbatim in any distributions. No written agreement,
|
|
||||||
license, or royalty fee is required for any of the authorized uses.
|
|
||||||
Modifications to this software may be copyrighted by their authors
|
|
||||||
and need not follow the licensing terms described here, provided that
|
|
||||||
the new terms are clearly indicated on the first page of each file where
|
|
||||||
they apply.
|
|
||||||
|
|
||||||
IN NO EVENT SHALL THE AUTHORS OR DISTRIBUTORS BE LIABLE TO ANY PARTY
|
|
||||||
FOR DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES
|
|
||||||
ARISING OUT OF THE USE OF THIS SOFTWARE, ITS DOCUMENTATION, OR ANY
|
|
||||||
DERIVATIVES THEREOF, EVEN IF THE AUTHORS HAVE BEEN ADVISED OF THE
|
|
||||||
POSSIBILITY OF SUCH DAMAGE.
|
|
||||||
|
|
||||||
THE AUTHORS AND DISTRIBUTORS SPECIFICALLY DISCLAIM ANY WARRANTIES,
|
|
||||||
INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY,
|
|
||||||
FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT. THIS SOFTWARE
|
|
||||||
IS PROVIDED ON AN "AS IS" BASIS, AND THE AUTHORS AND DISTRIBUTORS HAVE
|
|
||||||
NO OBLIGATION TO PROVIDE MAINTENANCE, SUPPORT, UPDATES, ENHANCEMENTS, OR
|
|
||||||
MODIFICATIONS.
|
|
||||||
|
|
||||||
GOVERNMENT USE: If you are acquiring this software on behalf of the
|
|
||||||
U.S. government, the Government shall have only "Restricted Rights"
|
|
||||||
in the software and related documentation as defined in the Federal
|
|
||||||
Acquisition Regulations (FARs) in Clause 52.227.19 (c) (2). If you
|
|
||||||
are acquiring the software on behalf of the Department of Defense, the
|
|
||||||
software shall be classified as "Commercial Computer Software" and the
|
|
||||||
Government shall have only "Restricted Rights" as defined in Clause
|
|
||||||
252.227-7013 (c) (1) of DFARs. Notwithstanding the foregoing, the
|
|
||||||
authors grant the U.S. Government and others acting in its behalf
|
|
||||||
permission to use and distribute the software in accordance with the
|
|
||||||
terms specified in this license.
|
|
||||||
@@ -12,6 +12,8 @@ ANSIBLE_REMOTE_TEMP=/tmp ANSIBLE_LOCAL_TEMP=/tmp ansible -i "127.0.0.1," -c loca
 
 if [ -z "$AWX_SKIP_MIGRATIONS" ]; then
   awx-manage migrate --noinput
+  awx-manage provision_instance --hostname=$(hostname)
+  awx-manage register_queue --queuename=tower --instance_percent=100
 fi
 
 if [ ! -z "$AWX_ADMIN_USER" ]&&[ ! -z "$AWX_ADMIN_PASSWORD" ]; then
@@ -21,8 +23,6 @@ if [ ! -z "$AWX_ADMIN_USER" ]&&[ ! -z "$AWX_ADMIN_PASSWORD" ]; then
 {% endif %}
 fi
 echo 'from django.conf import settings; x = settings.AWX_TASK_ENV; x["HOME"] = "/var/lib/awx"; settings.AWX_TASK_ENV = x' | awx-manage shell
-awx-manage provision_instance --hostname=$(hostname)
-awx-manage register_queue --queuename=tower --instance_percent=100
 
 unset $(cut -d = -f -1 /etc/tower/conf.d/environment.sh)
 
@@ -38,15 +38,6 @@ kubernetes_redis_image: "redis"
 kubernetes_redis_image_tag: "latest"
 kubernetes_redis_config_mount_path: "/usr/local/etc/redis/redis.conf"
 
-memcached_mem_request: 1
-memcached_cpu_request: 500
-memcached_security_context_enabled: true
-memcached_security_context_privileged: false
-memcached_security_context_user: 1001
-
-kubernetes_memcached_version: "latest"
-kubernetes_memcached_image: "memcached"
-
 openshift_pg_emptydir: false
 openshift_pg_pvc_name: postgresql
 
@@ -231,9 +231,6 @@ spec:
             - name: {{ kubernetes_deployment_name }}-redis-socket
               mountPath: "/var/run/redis"
 
-            - name: {{ kubernetes_deployment_name }}-memcached-socket
-              mountPath: "/var/run/memcached"
-
           resources:
             requests:
               memory: "{{ web_mem_request }}Gi"
@@ -310,9 +307,6 @@ spec:
 
             - name: {{ kubernetes_deployment_name }}-redis-socket
               mountPath: "/var/run/redis"
-            - name: {{ kubernetes_deployment_name }}-memcached-socket
-              mountPath: "/var/run/memcached"
-
           env:
             - name: SUPERVISOR_WEB_CONFIG_PATH
               value: "/etc/supervisord.conf"
@@ -376,40 +370,6 @@ spec:
 {% endif %}
 {% if redis_cpu_limit is defined %}
             cpu: "{{ redis_cpu_limit }}m"
-{% endif %}
-      - name: {{ kubernetes_deployment_name }}-memcached
-{% if memcached_security_context_enabled is defined and memcached_security_context_enabled | bool %}
-        securityContext:
-{% if memcached_security_context_privileged is defined %}
-          privileged: {{ memcached_security_context_privileged }}
-{% endif %}
-{% if memcached_security_context_user is defined %}
-          runAsUser: {{ memcached_security_context_user }}
-{% endif %}
-{% endif %}
-        image: "{{ kubernetes_memcached_image }}:{{ kubernetes_memcached_version }}"
-        imagePullPolicy: Always
-        command:
-          - 'memcached'
-          - '-s'
-          - '/var/run/memcached/memcached.sock'
-          - '-a'
-          - '0666'
-        volumeMounts:
-          - name: {{ kubernetes_deployment_name }}-memcached-socket
-            mountPath: "/var/run/memcached"
-        resources:
-          requests:
-            memory: "{{ memcached_mem_request }}Gi"
-            cpu: "{{ memcached_cpu_request }}m"
-{% if memcached_mem_limit is defined or memcached_cpu_limit is defined %}
-          limits:
-{% endif %}
-{% if memcached_mem_limit is defined %}
-            memory: "{{ memcached_mem_limit }}Gi"
-{% endif %}
-{% if memcached_cpu_limit is defined %}
-            cpu: "{{ memcached_cpu_limit }}m"
 {% endif %}
 {% if tolerations is defined %}
       tolerations:
@@ -516,9 +476,6 @@ spec:
       - name: {{ kubernetes_deployment_name }}-redis-socket
         emptyDir: {}
 
-      - name: {{ kubernetes_deployment_name }}-memcached-socket
-        emptyDir: {}
-
 ---
 apiVersion: v1
 kind: Service
@@ -591,4 +548,4 @@ spec:
       name: {{ kubernetes_deployment_name }}-web-svc
       weight: 100
   wildcardPolicy: None
 {% endif %}
@@ -7,7 +7,4 @@ redis_image: "redis"
 postgresql_version: "10"
 postgresql_image: "postgres:{{postgresql_version}}"
 
-memcached_image: "memcached"
-memcached_version: "alpine"
-
 compose_start_containers: true
@@ -10,12 +10,6 @@
     state: directory
     mode: 0777
 
-- name: Create Memcached socket directory
-  file:
-    path: "{{ docker_compose_dir }}/memcached_socket"
-    state: directory
-    mode: 0777
-
- name: Create Docker Compose Configuration
  template:
    src: "{{ item }}.j2"
@@ -7,7 +7,6 @@ services:
     container_name: awx_web
     depends_on:
       - redis
-      - memcached
 {% if pg_hostname is not defined %}
       - postgres
 {% endif %}
@@ -32,7 +31,6 @@ services:
       - "{{ docker_compose_dir }}/credentials.py:/etc/tower/conf.d/credentials.py"
       - "{{ docker_compose_dir }}/nginx.conf:/etc/nginx/nginx.conf:ro"
       - "{{ docker_compose_dir }}/redis_socket:/var/run/redis/:rw"
-      - "{{ docker_compose_dir }}/memcached_socket:/var/run/memcached/:rw"
 {% if project_data_dir is defined %}
       - "{{ project_data_dir +':/var/lib/awx/projects:rw' }}"
 {% endif %}
@@ -76,7 +74,6 @@ services:
     container_name: awx_task
     depends_on:
       - redis
-      - memcached
       - web
 {% if pg_hostname is not defined %}
       - postgres
@@ -93,7 +90,6 @@ services:
       - "{{ docker_compose_dir }}/environment.sh:/etc/tower/conf.d/environment.sh"
       - "{{ docker_compose_dir }}/credentials.py:/etc/tower/conf.d/credentials.py"
       - "{{ docker_compose_dir }}/redis_socket:/var/run/redis/:rw"
-      - "{{ docker_compose_dir }}/memcached_socket:/var/run/memcached/:rw"
 {% if project_data_dir is defined %}
       - "{{ project_data_dir +':/var/lib/awx/projects:rw' }}"
 {% endif %}
@@ -142,19 +138,6 @@ services:
     volumes:
       - "{{ docker_compose_dir }}/redis.conf:/usr/local/etc/redis/redis.conf:ro"
       - "{{ docker_compose_dir }}/redis_socket:/var/run/redis/:rw"
-      - "{{ docker_compose_dir }}/memcached_socket:/var/run/memcached/:rw"
-
-  memcached:
-    image: "{{ memcached_image }}:{{ memcached_version }}"
-    container_name: awx_memcached
-    command: ["-s", "/var/run/memcached/memcached.sock", "-a", "0666"]
-    restart: unless-stopped
-    environment:
-      http_proxy: {{ http_proxy | default('') }}
-      https_proxy: {{ https_proxy | default('') }}
-      no_proxy: {{ no_proxy | default('') }}
-    volumes:
-      - "{{ docker_compose_dir }}/memcached_socket:/var/run/memcached/:rw"
 
 {% if pg_hostname is not defined %}
   postgres:
@@ -17,6 +17,7 @@ django-polymorphic
 django-pglocks
 django-qsstats-magic
 django-radius==1.3.3  # FIX auth does not work with later versions
+django-redis
 django-solo
 django-split-settings
 django-taggit
@@ -33,7 +34,6 @@ prometheus_client
 psycopg2
 pygerduty
 pyparsing
-python-memcached
 python-radius
 python3-saml
 pyyaml>=5.3.1  # minimum version to pull in new pyyaml for CVE-2017-18342
@@ -32,6 +32,7 @@ django-oauth-toolkit==1.1.3  # via -r /awx_devel/requirements/requirements.in
 django-pglocks==1.0.4  # via -r /awx_devel/requirements/requirements.in
 django-polymorphic==2.1.2  # via -r /awx_devel/requirements/requirements.in
 django-qsstats-magic==1.1.0  # via -r /awx_devel/requirements/requirements.in
+django-redis==4.5.0
 django-radius==1.3.3  # via -r /awx_devel/requirements/requirements.in
 django-solo==1.1.3  # via -r /awx_devel/requirements/requirements.in
 django-split-settings==1.0.0  # via -r /awx_devel/requirements/requirements.in
@@ -93,7 +94,6 @@ pyrsistent==0.15.7  # via jsonschema
 python-daemon==2.2.4  # via ansible-runner
 python-dateutil==2.8.1  # via adal, kubernetes
 python-ldap==3.2.0  # via django-auth-ldap
-python-memcached==1.59  # via -r /awx_devel/requirements/requirements.in
 python-radius==1.0  # via -r /awx_devel/requirements/requirements.in
 python-string-utils==1.0.0  # via openshift
 python3-openid==3.1.0  # via social-auth-core
@@ -30,7 +30,6 @@ services:
     volumes:
       - "../:/awx_devel"
       - "./redis/redis_socket_ha_1:/var/run/redis/"
-      - "./memcached/:/var/run/memcached"
      - "./docker-compose/supervisor.conf:/etc/supervisord.conf"
     ports:
       - "5899-5999:5899-5999"
@@ -50,7 +49,6 @@ services:
     volumes:
       - "../:/awx_devel"
       - "./redis/redis_socket_ha_2:/var/run/redis/"
-      - "./memcached/:/var/run/memcached"
      - "./docker-compose/supervisor.conf:/etc/supervisord.conf"
     ports:
       - "7899-7999:7899-7999"
@@ -70,7 +68,6 @@ services:
     volumes:
       - "../:/awx_devel"
       - "./redis/redis_socket_ha_3:/var/run/redis/"
-      - "./memcached/:/var/run/memcached"
      - "./docker-compose/supervisor.conf:/etc/supervisord.conf"
     ports:
       - "8899-8999:8899-8999"
@@ -82,8 +79,6 @@ services:
     volumes:
       - "./redis/redis.conf:/usr/local/etc/redis/redis.conf"
       - "./redis/redis_socket_ha_1:/var/run/redis/"
-    ports:
-      - "63791:63791"
   redis_2:
     user: ${CURRENT_UID}
     image: redis:latest
@@ -92,8 +87,6 @@ services:
     volumes:
       - "./redis/redis.conf:/usr/local/etc/redis/redis.conf"
       - "./redis/redis_socket_ha_2:/var/run/redis/"
-    ports:
-      - "63792:63792"
   redis_3:
     user: ${CURRENT_UID}
     image: redis:latest
@@ -102,15 +95,6 @@ services:
     volumes:
       - "./redis/redis.conf:/usr/local/etc/redis/redis.conf"
       - "./redis/redis_socket_ha_3:/var/run/redis/"
-    ports:
-      - "63793:63793"
   postgres:
     image: postgres:10
     container_name: tools_postgres_1
-  memcached:
-    user: ${CURRENT_UID}
-    image: memcached:alpine
-    container_name: tools_memcached_1
-    command: ["memcached", "-s", "/var/run/memcached/memcached.sock", "-a", "0666"]
-    volumes:
-      - "./memcached/:/var/run/memcached"
@@ -23,7 +23,6 @@ services:
       - "7899-7999:7899-7999"  # default port range for sdb-listen
     links:
       - postgres
-      - memcached
       - redis
     # - sync
     # volumes_from:
@@ -33,8 +32,6 @@ services:
       - "../:/awx_devel"
       - "../awx/projects/:/var/lib/awx/projects/"
      - "./redis/redis_socket_standalone:/var/run/redis/"
-      - "./memcached/:/var/run/memcached"
-      - "./rsyslog/:/var/lib/awx/rsyslog"
      - "./docker-compose/supervisor.conf:/etc/supervisord.conf"
     privileged: true
     tty: true
@@ -55,18 +52,9 @@ services:
       POSTGRES_HOST_AUTH_METHOD: trust
     volumes:
       - "awx_db:/var/lib/postgresql/data"
-  memcached:
-    user: ${CURRENT_UID}
-    image: memcached:alpine
-    container_name: tools_memcached_1
-    command: ["memcached", "-s", "/var/run/memcached/memcached.sock", "-a", "0666"]
-    volumes:
-      - "./memcached/:/var/run/memcached"
   redis:
     image: redis:latest
     container_name: tools_redis_1
-    ports:
-      - "6379:6379"
     user: ${CURRENT_UID}
     volumes:
       - "./redis/redis.conf:/usr/local/etc/redis/redis.conf"