Compare commits

..

18 Commits

Author SHA1 Message Date
Jake Jackson
6389316206 Change artifact name from coverage-report to coverage.xml
Small change since the report is not being found; this is one of many small PRs to try to get this working.
2025-10-28 15:39:13 -04:00
Jake Jackson
d1d3a3471b Add new api schema check workflow (#16143)
* add new file to separate out the schema check so that it is no longer
  part of the CI check and won't cause the whole workflow to fail
* remove old API schema check from ci.yml
2025-10-27 11:59:40 -04:00
Alan Rominger
a53fdaddae Fix f-string in log that is broken (#16132) 2025-10-20 14:23:38 -04:00
Hao Liu
f72591195e Fix migration failure for 0200 (#16135)
Moved the AddField operation before the RunPython operations for 'rename_jts' and 'rename_projects' in migration 0200_template_name_constraint.py. This ensures the new 'org_unique' field exists before related data migrations are executed.

Fix
```
django.db.utils.ProgrammingError: column main_unifiedjobtemplate.org_unique does not exist
```

while applying migration 0200_template_name_constraint.py

when there's a job template or project with a duplicate name in the same org
2025-10-20 08:50:23 -04:00
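The ordering fix described above can be illustrated without Django: the RunPython data migrations read the new 'org_unique' column, so the AddField operation must run first. A dependency-free sketch (table rows modeled as dicts; function names follow the commit message, everything else is illustrative):

```python
# Illustration of the ordering bug fixed above: the data step reads the new
# 'org_unique' column, so the "AddField" step must run before it.
def add_field(rows, name, default=None):
    # stands in for migrations.AddField: give every row the new column
    for row in rows:
        row.setdefault(name, default)

def rename_duplicates(rows):
    # stands in for the rename_jts / rename_projects RunPython steps,
    # which reference the new column
    for row in rows:
        if row['org_unique'] is None:  # KeyError here if AddField hasn't run
            row['name'] += '_dup'

rows = [{'name': 'jt'}]
add_field(rows, 'org_unique')   # AddField first (the fix)
rename_duplicates(rows)         # now the column exists
```

Running `rename_duplicates` before `add_field` raises a `KeyError`, the dict-level analogue of the `ProgrammingError` quoted above.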
TVo
0d9483b54c Added Django and API requirements to AWX Contributor Docs for POC (#16093)
* Requirements POC docs from Claude Code eval

* Removed unnecessary reference.

* Excluded custom DRF configurations per @AlanCoding

* Implement review changes from @chrismeyersfsu

---------

Co-authored-by: Peter Braun <pbraun@redhat.com>
2025-10-16 10:38:37 -06:00
jessicamack
f3fd9945d6 Update dependencies (#16122)
* prometheus-client returns an additional value as of v0.22.0

* add license, remove outdated ones, add new embedded sources

* update requirements and UPGRADE BLOCKERs in README
2025-10-15 15:55:21 +00:00
Jake Jackson
72a42f23d5 Remove 'UI' from PR template component options (#16123)
Removed 'UI' from the list of component names in the PR template.
2025-10-13 10:23:56 -04:00
Jake Jackson
309724b12b Add SonarQube Coverage and report generation (#16112)
* added sonar config file and started cleaning up
* we do not place the report at the root of the repo
* limit scope to only the awx directory and its contents
* update exclusions for things in awx/ that we don't want covered
2025-10-10 10:44:35 -04:00
Seth Foster
300605ff73 Make subscriptions credentials mutually exclusive (#16126)
settings.SUBSCRIPTIONS_USERNAME and
settings.SUBSCRIPTIONS_CLIENT_ID

should be mutually exclusive. This is because
the POST to api/v2/config/attach/ accepts only
a subscription_id, and infers which credentials to
use based on settings. If both are set, it is ambiguous
and can lead to unexpected 400s when attempting
to attach a license.

Signed-off-by: Seth Foster <fosterbseth@gmail.com>
2025-10-09 16:57:58 -04:00
Jake Jackson
77fab1c534 Update dependabot to check python deps (#16127)
* add list item to check python deps daily
* move to weekly after we gain confidence
2025-10-09 19:40:34 +00:00
Chris Meyers
51b2524b25 Gracefully handle hostname change in metrics code
* Previously, we would error out because we assumed that any metrics
  payload we got from redis contained data and that the data was for the
  current host.
* Now, we no longer assume that a metrics payload is well formed and for
  the current hostname, because the hostname could have changed and we may
  not yet have collected metrics for the new host.
2025-10-09 14:08:01 -04:00
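The defensive lookup this commit describes boils down to treating both "no payload" and "payload without my hostname" as empty results rather than errors. A minimal sketch (data shapes are illustrative):

```python
# Sketch of graceful handling of a stale metrics payload: after a hostname
# change, redis may hold data only for the previous hostname.
def metrics_for_host(instance_data, my_hostname):
    """Return this node's metrics dict, or None if absent or stale."""
    if not instance_data:
        return None  # nothing collected yet for this namespace
    host_metrics = instance_data.get(my_hostname)
    if not host_metrics:
        # payload exists, but only for a different (old) hostname
        return None
    return host_metrics
```

This mirrors the walrus-operator guard added in the collector further down, which logs and returns `None` instead of raising when the current hostname's entry is missing.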
Chris Coutinho
612e8e7688 Fix duplicate metrics in AWX subsystem_metrics (#15964)
Separate out operation subsystem metrics to fix duplicate error

Remove unnecessary comments

Revert to single subsystem_metrics_* metric with labels

Format via black
2025-10-09 10:28:55 +02:00
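The "single metric with labels" approach this commit lands on can be shown with a small exposition-format helper, matching the label-building logic in the diff further down (function name is illustrative):

```python
# Sketch of labeled Prometheus exposition output: one metric family with a
# 'subsystem' label, instead of duplicate per-subsystem metric families.
def to_prometheus_line(field, node, value, namespace=None):
    labels = f'node="{node}"'
    if namespace:
        labels += f',subsystem="{namespace}"'
    return f'{field}{{{labels}}} {value}'
```

Emitting the same metric name from several subsystems without the label produces duplicate families, which Prometheus clients reject; with the label, each sample stays unique.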
Andrea Restle-Lay
0d18308112 [AAP-46830]: Fix AWX CLI authentication with AAP Gateway environments (#16113)
* migrate pr #16081 to new fork

* add test coverage - add clearer error messaging

* update tests to use monkeypatch
2025-10-02 10:06:10 -04:00
Chris Meyers
f51af03424 Create system_administrator rbac role in migration
* We had race conditions with the system_administrator role being
  created just-in-time. Instead of fixing the race condition(s), dodge
  them by ensuring the role always exists
2025-10-02 08:25:40 -04:00
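The dodge described above relies on the creation step being idempotent, so a migration can run it unconditionally and no request path has to race to create the role just-in-time. The real change presumably uses a Django data migration with a get-or-create pattern; a dependency-free sketch of the idempotence (dict-backed store is illustrative):

```python
# Sketch of idempotent role creation: calling it any number of times yields
# the same role object, so a migration can ensure it always exists up front.
roles = {}

def get_or_create_role(name):
    # setdefault only inserts on first call, like a get_or_create
    return roles.setdefault(name, {'singleton_name': name, 'role_field': name})

first = get_or_create_role('system_administrator')
second = get_or_create_role('system_administrator')
```

`first is second` holds: repeated callers all see one role, which is the property that makes the just-in-time race irrelevant.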
Alan Rominger
622f6ea166 AAP-53980 Disconnect logic to fill in role parents (#15462)
* Disconnect logic to fill in role parents

Get tests passing hopefully

Whatever SonarCloud

* remove role parents/children endpoints and related views

* remove duplicate get_queryset method from RoleTeamsList

---------

Co-authored-by: Peter Braun <pbraun@redhat.com>
2025-10-02 13:06:37 +02:00
Seth Foster
2729076f7f Add basic auth to subscription management API (#16103)
Allow users to do subscription management using
Red Hat username and password.

In basic auth case, the candlepin API
at subscriptions.rhsm.redhat.com will be used instead
of console.redhat.com.

Signed-off-by: Seth Foster <fosterbseth@gmail.com>
2025-10-02 01:06:47 -04:00
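The endpoint choice the commit message describes can be captured in one branch (hosts are the ones named above; the function is illustrative):

```python
# Sketch of the API host selection described above: basic auth (Red Hat
# username/password) goes to the candlepin API, service accounts to console.
def subscription_api_host(basic_auth):
    if basic_auth:
        return 'subscriptions.rhsm.redhat.com'
    return 'console.redhat.com'
```

This matches the `basic_auth` flag threaded through `validate_rh(user, pw, basic_auth)` in the diffs further down.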
Alan Rominger
6db08bfa4e Rewrite the s3 upload step to fix breakage with new Ansible version (#16111)
* Rewrite the s3 upload step to fix breakage with new Ansible version

* Use commit hash for security

* Add the public read flag
2025-09-30 15:47:54 -04:00
Stevenson Michel
ceed41d352 Sharing Credentials Across Organizations (#16106)
* Added tests for cross org sharing of credentials

* added negative testing for sharing of credentials

* added conditions and tests for roleteamslist regarding cross org credentials

* removed redundant codes

* made the error message more articulate and specific
2025-09-30 10:44:27 -04:00
47 changed files with 3379 additions and 837 deletions

View File

@@ -17,7 +17,6 @@ in as the first entry for your PR title.
##### COMPONENT NAME
<!--- Name of the module/plugin/module/task -->
- API
- UI
- Collection
- CLI
- Docs

View File

@@ -8,3 +8,10 @@ updates:
labels:
- "docs"
- "dependencies"
- package-ecosystem: "pip"
directory: "requirements/"
schedule:
interval: "daily" #run daily until we trust it, then back this off to weekly
open-pull-requests-limit: 2
labels:
- "dependencies"

.github/workflows/api_schema_check.yml vendored Normal file
View File

@@ -0,0 +1,66 @@
---
name: API Schema Change Detection
env:
LC_ALL: "C.UTF-8" # prevent ERROR: Ansible could not initialize the preferred locale: unsupported locale setting
CI_GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DEV_DOCKER_OWNER: ${{ github.repository_owner }}
COMPOSE_TAG: ${{ github.base_ref || 'devel' }}
UPSTREAM_REPOSITORY_ID: 91594105
on:
pull_request:
branches:
- devel
- release_**
- feature_**
- stable-**
jobs:
api-schema-detection:
name: Detect API Schema Changes
runs-on: ubuntu-latest
timeout-minutes: 30
permissions:
packages: write
contents: read
steps:
- uses: actions/checkout@v4
with:
show-progress: false
fetch-depth: 0
- name: Build awx_devel image for schema check
uses: ./.github/actions/awx_devel_image
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
private-github-key: ${{ secrets.PRIVATE_GITHUB_KEY }}
- name: Detect API schema changes
id: schema-check
continue-on-error: true
run: |
AWX_DOCKER_ARGS='-e GITHUB_ACTIONS' \
AWX_DOCKER_CMD='make detect-schema-change SCHEMA_DIFF_BASE_BRANCH=${{ github.event.pull_request.base.ref }}' \
make docker-runner 2>&1 | tee schema-diff.txt
exit ${PIPESTATUS[0]}
- name: Add schema diff to job summary
if: always()
# show text and if for some reason, it can't be generated, state that it can't be.
run: |
echo "## API Schema Change Detection Results" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
if [ -f schema-diff.txt ]; then
if grep -q "^+" schema-diff.txt || grep -q "^-" schema-diff.txt; then
echo "### Schema changes detected" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo '```diff' >> $GITHUB_STEP_SUMMARY
cat schema-diff.txt >> $GITHUB_STEP_SUMMARY
echo '```' >> $GITHUB_STEP_SUMMARY
else
echo "### No schema changes detected" >> $GITHUB_STEP_SUMMARY
fi
else
echo "### Unable to generate schema diff" >> $GITHUB_STEP_SUMMARY
fi

View File

@@ -38,12 +38,6 @@ jobs:
- name: awx-collection
command: /start_tests.sh test_collection_all
coverage-upload-name: "awx-collection"
- name: api-schema
command: >-
/start_tests.sh detect-schema-change SCHEMA_DIFF_BASE_BRANCH=${{
github.event.pull_request.base.ref || github.ref_name
}}
coverage-upload-name: ""
steps:
- uses: actions/checkout@v4

.github/workflows/sonarcloud_pr.yml vendored Normal file
View File

@@ -0,0 +1,85 @@
---
name: SonarQube
on:
workflow_run:
workflows:
- CI
types:
- completed
permissions: read-all
jobs:
sonarqube:
runs-on: ubuntu-latest
if: github.event.workflow_run.conclusion == 'success' && github.event.workflow_run.event == 'pull_request'
steps:
- name: Checkout Code
uses: actions/checkout@v4
with:
fetch-depth: 0
show-progress: false
- name: Download coverage report artifact
uses: actions/download-artifact@v4
with:
name: coverage.xml
path: reports/
github-token: ${{ secrets.GITHUB_TOKEN }}
run-id: ${{ github.event.workflow_run.id }}
- name: Download PR number artifact
uses: actions/download-artifact@v4
with:
name: pr-number
path: .
github-token: ${{ secrets.GITHUB_TOKEN }}
run-id: ${{ github.event.workflow_run.id }}
- name: Extract PR number
run: |
cat pr-number.txt
echo "PR_NUMBER=$(cat pr-number.txt)" >> $GITHUB_ENV
- name: Get PR info
uses: octokit/request-action@v2.x
id: pr_info
with:
route: GET /repos/{repo}/pulls/{number}
repo: ${{ github.event.repository.full_name }}
number: ${{ env.PR_NUMBER }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Set PR info into env
run: |
echo "PR_BASE=${{ fromJson(steps.pr_info.outputs.data).base.ref }}" >> $GITHUB_ENV
echo "PR_HEAD=${{ fromJson(steps.pr_info.outputs.data).head.ref }}" >> $GITHUB_ENV
- name: Add base branch
run: |
gh pr checkout ${{ env.PR_NUMBER }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Extract and export repo owner/name
run: |
REPO_SLUG="${GITHUB_REPOSITORY}"
IFS="/" read -r REPO_OWNER REPO_NAME <<< "$REPO_SLUG"
echo "REPO_OWNER=$REPO_OWNER" >> $GITHUB_ENV
echo "REPO_NAME=$REPO_NAME" >> $GITHUB_ENV
- name: SonarQube scan
uses: SonarSource/sonarqube-scan-action@v5
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
SONAR_TOKEN: ${{ secrets[format('{0}', vars.SONAR_TOKEN_SECRET_NAME)] }}
with:
args: >
-Dsonar.organization=${{ env.REPO_OWNER }}
-Dsonar.projectKey=${{ env.REPO_OWNER }}_${{ env.REPO_NAME }}
-Dsonar.pullrequest.key=${{ env.PR_NUMBER }}
-Dsonar.pullrequest.branch=${{ env.PR_HEAD }}
-Dsonar.pullrequest.base=${{ env.PR_BASE }}
-Dsonar.scm.revision=${{ github.event.workflow_run.head_sha }}

View File

@@ -38,7 +38,7 @@ jobs:
--workdir=/awx_devel `make print-DEVEL_IMAGE_NAME` /start_tests.sh genschema
- name: Upload API Schema
uses: keithweaver/aws-s3-github-action@v1.0.0
uses: keithweaver/aws-s3-github-action@4dd5a7b81d54abaa23bbac92b27e85d7f405ae53
with:
command: cp
source: ${{ github.workspace }}/schema.json
@@ -46,3 +46,4 @@ jobs:
aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY }}
aws_secret_access_key: ${{ secrets.AWS_SECRET_KEY }}
aws_region: us-east-1
flags: --acl public-read --only-show-errors

View File

@@ -3,7 +3,7 @@
from django.urls import re_path
from awx.api.views import RoleList, RoleDetail, RoleUsersList, RoleTeamsList, RoleParentsList, RoleChildrenList
from awx.api.views import RoleList, RoleDetail, RoleUsersList, RoleTeamsList
urls = [
@@ -11,8 +11,6 @@ urls = [
re_path(r'^(?P<pk>[0-9]+)/$', RoleDetail.as_view(), name='role_detail'),
re_path(r'^(?P<pk>[0-9]+)/users/$', RoleUsersList.as_view(), name='role_users_list'),
re_path(r'^(?P<pk>[0-9]+)/teams/$', RoleTeamsList.as_view(), name='role_teams_list'),
re_path(r'^(?P<pk>[0-9]+)/parents/$', RoleParentsList.as_view(), name='role_parents_list'),
re_path(r'^(?P<pk>[0-9]+)/children/$', RoleChildrenList.as_view(), name='role_children_list'),
]
__all__ = ['urls']

View File

@@ -720,9 +720,19 @@ class TeamRolesList(SubListAttachDetachAPIView):
team = get_object_or_404(models.Team, pk=self.kwargs['pk'])
credential_content_type = ContentType.objects.get_for_model(models.Credential)
if role.content_type == credential_content_type:
if not role.content_object.organization or role.content_object.organization.id != team.organization.id:
data = dict(msg=_("You cannot grant credential access to a team when the Organization field isn't set, or belongs to a different organization"))
if not role.content_object.organization:
data = dict(
msg=_("You cannot grant access to a credential that is not assigned to an organization (private credentials cannot be assigned to teams)")
)
return Response(data, status=status.HTTP_400_BAD_REQUEST)
elif role.content_object.organization.id != team.organization.id:
if not request.user.is_superuser:
data = dict(
msg=_(
"You cannot grant a team access to a credential in a different organization. Only superusers can grant cross-organization credential access to teams"
)
)
return Response(data, status=status.HTTP_400_BAD_REQUEST)
return super(TeamRolesList, self).post(request, *args, **kwargs)
@@ -4203,9 +4213,21 @@ class RoleTeamsList(SubListAttachDetachAPIView):
credential_content_type = ContentType.objects.get_for_model(models.Credential)
if role.content_type == credential_content_type:
if not role.content_object.organization or role.content_object.organization.id != team.organization.id:
data = dict(msg=_("You cannot grant credential access to a team when the Organization field isn't set, or belongs to a different organization"))
# Private credentials (no organization) are never allowed for teams
if not role.content_object.organization:
data = dict(
msg=_("You cannot grant access to a credential that is not assigned to an organization (private credentials cannot be assigned to teams)")
)
return Response(data, status=status.HTTP_400_BAD_REQUEST)
# Cross-organization credentials are only allowed for superusers
elif role.content_object.organization.id != team.organization.id:
if not request.user.is_superuser:
data = dict(
msg=_(
"You cannot grant a team access to a credential in a different organization. Only superusers can grant cross-organization credential access to teams"
)
)
return Response(data, status=status.HTTP_400_BAD_REQUEST)
action = 'attach'
if request.data.get('disassociate', None):
@@ -4225,34 +4247,6 @@ class RoleTeamsList(SubListAttachDetachAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
class RoleParentsList(SubListAPIView):
deprecated = True
model = models.Role
serializer_class = serializers.RoleSerializer
parent_model = models.Role
relationship = 'parents'
permission_classes = (IsAuthenticated,)
search_fields = ('role_field', 'content_type__model')
def get_queryset(self):
role = models.Role.objects.get(pk=self.kwargs['pk'])
return models.Role.filter_visible_roles(self.request.user, role.parents.all())
class RoleChildrenList(SubListAPIView):
deprecated = True
model = models.Role
serializer_class = serializers.RoleSerializer
parent_model = models.Role
relationship = 'children'
permission_classes = (IsAuthenticated,)
search_fields = ('role_field', 'content_type__model')
def get_queryset(self):
role = models.Role.objects.get(pk=self.kwargs['pk'])
return models.Role.filter_visible_roles(self.request.user, role.children.all())
# Create view functions for all of the class-based views to simplify inclusion
# in URL patterns and reverse URL lookups, converting CamelCase names to
# lowercase_with_underscore (e.g. MyView.as_view() becomes my_view).

View File

@@ -180,16 +180,47 @@ class ApiV2SubscriptionView(APIView):
def post(self, request):
data = request.data.copy()
if data.get('subscriptions_client_secret') == '$encrypted$':
data['subscriptions_client_secret'] = settings.SUBSCRIPTIONS_CLIENT_SECRET
try:
user, pw = data.get('subscriptions_client_id'), data.get('subscriptions_client_secret')
user = None
pw = None
basic_auth = False
# determine if the credentials are for basic auth or not
if data.get('subscriptions_client_id'):
user, pw = data.get('subscriptions_client_id'), data.get('subscriptions_client_secret')
if pw == '$encrypted$':
pw = settings.SUBSCRIPTIONS_CLIENT_SECRET
elif data.get('subscriptions_username'):
user, pw = data.get('subscriptions_username'), data.get('subscriptions_password')
if pw == '$encrypted$':
pw = settings.SUBSCRIPTIONS_PASSWORD
basic_auth = True
if not user or not pw:
return Response({"error": _("Missing subscription credentials")}, status=status.HTTP_400_BAD_REQUEST)
with set_environ(**settings.AWX_TASK_ENV):
validated = get_licenser().validate_rh(user, pw)
if user:
settings.SUBSCRIPTIONS_CLIENT_ID = data['subscriptions_client_id']
if pw:
settings.SUBSCRIPTIONS_CLIENT_SECRET = data['subscriptions_client_secret']
validated = get_licenser().validate_rh(user, pw, basic_auth)
# update settings if the credentials were valid
if basic_auth:
if user:
settings.SUBSCRIPTIONS_USERNAME = user
if pw:
settings.SUBSCRIPTIONS_PASSWORD = pw
# mutual exclusion for basic auth and service account
# only one should be set at a given time so that
# config/attach/ knows which credentials to use
settings.SUBSCRIPTIONS_CLIENT_ID = ""
settings.SUBSCRIPTIONS_CLIENT_SECRET = ""
else:
if user:
settings.SUBSCRIPTIONS_CLIENT_ID = user
if pw:
settings.SUBSCRIPTIONS_CLIENT_SECRET = pw
# mutual exclusion for basic auth and service account
settings.SUBSCRIPTIONS_USERNAME = ""
settings.SUBSCRIPTIONS_PASSWORD = ""
except Exception as exc:
msg = _("Invalid Subscription")
if isinstance(exc, TokenError) or (
@@ -225,16 +256,21 @@ class ApiV2AttachView(APIView):
if not subscription_id:
return Response({"error": _("No subscription ID provided.")}, status=status.HTTP_400_BAD_REQUEST)
# Ensure we always use the latest subscription credentials
cache.delete_many(['SUBSCRIPTIONS_CLIENT_ID', 'SUBSCRIPTIONS_CLIENT_SECRET'])
cache.delete_many(['SUBSCRIPTIONS_CLIENT_ID', 'SUBSCRIPTIONS_CLIENT_SECRET', 'SUBSCRIPTIONS_USERNAME', 'SUBSCRIPTIONS_PASSWORD'])
user = getattr(settings, 'SUBSCRIPTIONS_CLIENT_ID', None)
pw = getattr(settings, 'SUBSCRIPTIONS_CLIENT_SECRET', None)
basic_auth = False
if not (user and pw):
user = getattr(settings, 'SUBSCRIPTIONS_USERNAME', None)
pw = getattr(settings, 'SUBSCRIPTIONS_PASSWORD', None)
basic_auth = True
if not (user and pw):
return Response({"error": _("Missing subscription credentials")}, status=status.HTTP_400_BAD_REQUEST)
if subscription_id and user and pw:
data = request.data.copy()
try:
with set_environ(**settings.AWX_TASK_ENV):
validated = get_licenser().validate_rh(user, pw)
validated = get_licenser().validate_rh(user, pw, basic_auth)
except Exception as exc:
msg = _("Invalid Subscription")
if isinstance(exc, requests.exceptions.HTTPError) and getattr(getattr(exc, 'response', None), 'status_code', None) == 401:
@@ -248,6 +284,7 @@ class ApiV2AttachView(APIView):
else:
logger.exception(smart_str(u"Invalid subscription submitted."), extra=dict(actor=request.user.username))
return Response({"error": msg}, status=status.HTTP_400_BAD_REQUEST)
for sub in validated:
if sub['subscription_id'] == subscription_id:
sub['valid_key'] = True

View File

@@ -44,11 +44,12 @@ class MetricsServer(MetricsServerSettings):
class BaseM:
def __init__(self, field, help_text):
def __init__(self, field, help_text, labels=None):
self.field = field
self.help_text = help_text
self.current_value = 0
self.metric_has_changed = False
self.labels = labels or {}
def reset_value(self, conn):
conn.hset(root_key, self.field, 0)
@@ -69,12 +70,16 @@ class BaseM:
value = conn.hget(root_key, self.field)
return self.decode_value(value)
def to_prometheus(self, instance_data):
def to_prometheus(self, instance_data, namespace=None):
output_text = f"# HELP {self.field} {self.help_text}\n# TYPE {self.field} gauge\n"
for instance in instance_data:
if self.field in instance_data[instance]:
# Build label string
labels = f'node="{instance}"'
if namespace:
labels += f',subsystem="{namespace}"'
# on upgrade, if there are stale instances, we can end up with issues where new metrics are not present
output_text += f'{self.field}{{node="{instance}"}} {instance_data[instance][self.field]}\n'
output_text += f'{self.field}{{{labels}}} {instance_data[instance][self.field]}\n'
return output_text
@@ -167,14 +172,17 @@ class HistogramM(BaseM):
self.sum.store_value(conn)
self.inf.store_value(conn)
def to_prometheus(self, instance_data):
def to_prometheus(self, instance_data, namespace=None):
output_text = f"# HELP {self.field} {self.help_text}\n# TYPE {self.field} histogram\n"
for instance in instance_data:
# Build label string
node_label = f'node="{instance}"'
subsystem_label = f',subsystem="{namespace}"' if namespace else ''
for i, b in enumerate(self.buckets):
output_text += f'{self.field}_bucket{{le="{b}",node="{instance}"}} {sum(instance_data[instance][self.field]["counts"][0:i+1])}\n'
output_text += f'{self.field}_bucket{{le="+Inf",node="{instance}"}} {instance_data[instance][self.field]["inf"]}\n'
output_text += f'{self.field}_count{{node="{instance}"}} {instance_data[instance][self.field]["inf"]}\n'
output_text += f'{self.field}_sum{{node="{instance}"}} {instance_data[instance][self.field]["sum"]}\n'
output_text += f'{self.field}_bucket{{le="{b}",{node_label}{subsystem_label}}} {sum(instance_data[instance][self.field]["counts"][0:i+1])}\n'
output_text += f'{self.field}_bucket{{le="+Inf",{node_label}{subsystem_label}}} {instance_data[instance][self.field]["inf"]}\n'
output_text += f'{self.field}_count{{{node_label}{subsystem_label}}} {instance_data[instance][self.field]["inf"]}\n'
output_text += f'{self.field}_sum{{{node_label}{subsystem_label}}} {instance_data[instance][self.field]["sum"]}\n'
return output_text
@@ -273,20 +281,22 @@ class Metrics(MetricsNamespace):
def pipe_execute(self):
if self.metrics_have_changed is True:
duration_to_save = time.perf_counter()
duration_pipe_exec = time.perf_counter()
for m in self.METRICS:
self.METRICS[m].store_value(self.pipe)
self.pipe.execute()
self.last_pipe_execute = time.time()
self.metrics_have_changed = False
duration_to_save = time.perf_counter() - duration_to_save
self.METRICS['subsystem_metrics_pipe_execute_seconds'].inc(duration_to_save)
self.METRICS['subsystem_metrics_pipe_execute_calls'].inc(1)
duration_pipe_exec = time.perf_counter() - duration_pipe_exec
duration_to_save = time.perf_counter()
duration_send_metrics = time.perf_counter()
self.send_metrics()
duration_to_save = time.perf_counter() - duration_to_save
self.METRICS['subsystem_metrics_send_metrics_seconds'].inc(duration_to_save)
duration_send_metrics = time.perf_counter() - duration_send_metrics
# Increment operational metrics
self.METRICS['subsystem_metrics_pipe_execute_seconds'].inc(duration_pipe_exec)
self.METRICS['subsystem_metrics_pipe_execute_calls'].inc(1)
self.METRICS['subsystem_metrics_send_metrics_seconds'].inc(duration_send_metrics)
def send_metrics(self):
# more than one thread could be calling this at the same time, so should
@@ -352,7 +362,13 @@ class Metrics(MetricsNamespace):
if instance_data:
for field in self.METRICS:
if len(metrics_filter) == 0 or field in metrics_filter:
output_text += self.METRICS[field].to_prometheus(instance_data)
# Add subsystem label only for operational metrics
namespace = (
self._namespace
if field in ['subsystem_metrics_pipe_execute_seconds', 'subsystem_metrics_pipe_execute_calls', 'subsystem_metrics_send_metrics_seconds']
else None
)
output_text += self.METRICS[field].to_prometheus(instance_data, namespace)
return output_text
@@ -440,7 +456,10 @@ class CustomToPrometheusMetricsCollector(prometheus_client.registry.Collector):
logger.debug(f"No metric data not found in redis for metric namespace '{self._metrics._namespace}'")
return None
host_metrics = instance_data.get(my_hostname)
if not (host_metrics := instance_data.get(my_hostname)):
logger.debug(f"Metric data for this node '{my_hostname}' not found in redis for metric namespace '{self._metrics._namespace}'")
return None
for _, metric in self._metrics.METRICS.items():
entry = host_metrics.get(metric.field)
if not entry:

View File

@@ -144,6 +144,35 @@ register(
category_slug='system',
)
register(
'SUBSCRIPTIONS_USERNAME',
field_class=fields.CharField,
default='',
allow_blank=True,
encrypted=False,
read_only=False,
label=_('Red Hat Username for Subscriptions'),
help_text=_('Username used to retrieve subscription and content information'), # noqa
category=_('System'),
category_slug='system',
hidden=True,
)
register(
'SUBSCRIPTIONS_PASSWORD',
field_class=fields.CharField,
default='',
allow_blank=True,
encrypted=True,
read_only=False,
label=_('Red Hat Password for Subscriptions'),
help_text=_('Password used to retrieve subscription and content information'), # noqa
category=_('System'),
category_slug='system',
hidden=True,
)
register(
'SUBSCRIPTIONS_CLIENT_ID',
field_class=fields.CharField,
@@ -155,6 +184,7 @@ register(
help_text=_('Client ID used to retrieve subscription and content information'), # noqa
category=_('System'),
category_slug='system',
hidden=True,
)
register(
@@ -168,6 +198,7 @@ register(
help_text=_('Client secret used to retrieve subscription and content information'), # noqa
category=_('System'),
category_slug='system',
hidden=True,
)
register(

View File

@@ -14,21 +14,14 @@ from jinja2.exceptions import UndefinedError, TemplateSyntaxError, SecurityError
# Django
from django.core import exceptions as django_exceptions
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models.signals import (
post_save,
post_delete,
)
from django.db.models.signals import m2m_changed
from django.db.models.signals import m2m_changed, post_save
from django.db import models
from django.db.models.fields.related import lazy_related_operation
from django.db.models.fields.related_descriptors import (
ReverseOneToOneDescriptor,
ForwardManyToOneDescriptor,
ManyToManyDescriptor,
ReverseManyToOneDescriptor,
create_forward_many_to_many_manager,
)
from django.utils.encoding import smart_str
from django.db.models import JSONField
from django.utils.functional import cached_property
from django.utils.translation import gettext_lazy as _
@@ -54,7 +47,6 @@ __all__ = [
'ImplicitRoleField',
'SmartFilterField',
'OrderedManyToManyField',
'update_role_parentage_for_instance',
'is_implicit_parent',
]
@@ -146,34 +138,6 @@ class AutoOneToOneField(models.OneToOneField):
setattr(cls, related.get_accessor_name(), AutoSingleRelatedObjectDescriptor(related))
def resolve_role_field(obj, field):
ret = []
field_components = field.split('.', 1)
if hasattr(obj, field_components[0]):
obj = getattr(obj, field_components[0])
else:
return []
if obj is None:
return []
if len(field_components) == 1:
# use extremely generous duck typing to accomidate all possible forms
# of the model that may be used during various migrations
if obj._meta.model_name != 'role' or obj._meta.app_label != 'main':
raise Exception(smart_str('{} refers to a {}, not a Role'.format(field, type(obj))))
ret.append(obj.id)
else:
if type(obj) is ManyToManyDescriptor:
for o in obj.all():
ret += resolve_role_field(o, field_components[1])
else:
ret += resolve_role_field(obj, field_components[1])
return ret
def is_implicit_parent(parent_role, child_role):
"""
Determine if the parent_role is an implicit parent as defined by
@@ -210,34 +174,6 @@ def is_implicit_parent(parent_role, child_role):
return False
def update_role_parentage_for_instance(instance):
"""update_role_parentage_for_instance
updates the parents listing for all the roles
of a given instance if they have changed
"""
parents_removed = set()
parents_added = set()
for implicit_role_field in getattr(instance.__class__, '__implicit_role_fields'):
cur_role = getattr(instance, implicit_role_field.name)
original_parents = set(json.loads(cur_role.implicit_parents))
new_parents = implicit_role_field._resolve_parent_roles(instance)
removals = original_parents - new_parents
if removals:
cur_role.parents.remove(*list(removals))
parents_removed.add(cur_role.pk)
additions = new_parents - original_parents
if additions:
cur_role.parents.add(*list(additions))
parents_added.add(cur_role.pk)
new_parents_list = list(new_parents)
new_parents_list.sort()
new_parents_json = json.dumps(new_parents_list)
if cur_role.implicit_parents != new_parents_json:
cur_role.implicit_parents = new_parents_json
cur_role.save(update_fields=['implicit_parents'])
return (parents_added, parents_removed)
class ImplicitRoleDescriptor(ForwardManyToOneDescriptor):
pass
@@ -269,65 +205,6 @@ class ImplicitRoleField(models.ForeignKey):
getattr(cls, '__implicit_role_fields').append(self)
post_save.connect(self._post_save, cls, True, dispatch_uid='implicit-role-post-save')
post_delete.connect(self._post_delete, cls, True, dispatch_uid='implicit-role-post-delete')
function = lambda local, related, field: self.bind_m2m_changed(field, related, local)
lazy_related_operation(function, cls, "self", field=self)
def bind_m2m_changed(self, _self, _role_class, cls):
if not self.parent_role:
return
field_names = self.parent_role
if type(field_names) is not list:
field_names = [field_names]
for field_name in field_names:
if field_name.startswith('singleton:'):
continue
field_name, sep, field_attr = field_name.partition('.')
# Non existent fields will occur if ever a parent model is
# moved inside a migration, needed for job_template_organization_field
# migration in particular
# consistency is assured by unit test awx.main.tests.functional
field = getattr(cls, field_name, None)
if field and type(field) is ReverseManyToOneDescriptor or type(field) is ManyToManyDescriptor:
if '.' in field_attr:
raise Exception('Referencing deep roles through ManyToMany fields is unsupported.')
if type(field) is ReverseManyToOneDescriptor:
sender = field.through
else:
sender = field.related.through
reverse = type(field) is ManyToManyDescriptor
m2m_changed.connect(self.m2m_update(field_attr, reverse), sender, weak=False)
def m2m_update(self, field_attr, _reverse):
def _m2m_update(instance, action, model, pk_set, reverse, **kwargs):
if action == 'post_add' or action == 'pre_remove':
if _reverse:
reverse = not reverse
if reverse:
for pk in pk_set:
obj = model.objects.get(pk=pk)
if action == 'post_add':
getattr(instance, field_attr).children.add(getattr(obj, self.name))
if action == 'pre_remove':
getattr(instance, field_attr).children.remove(getattr(obj, self.name))
else:
for pk in pk_set:
obj = model.objects.get(pk=pk)
if action == 'post_add':
getattr(instance, self.name).parents.add(getattr(obj, field_attr))
if action == 'pre_remove':
getattr(instance, self.name).parents.remove(getattr(obj, field_attr))
return _m2m_update
def _post_save(self, instance, created, *args, **kwargs):
Role_ = utils.get_current_apps().get_model('main', 'Role')
@@ -337,68 +214,24 @@ class ImplicitRoleField(models.ForeignKey):
Model = utils.get_current_apps().get_model('main', instance.__class__.__name__)
latest_instance = Model.objects.get(pk=instance.pk)
# Avoid circular import
from awx.main.models.rbac import batch_role_ancestor_rebuilding, Role
# Create any missing role objects
missing_roles = []
for implicit_role_field in getattr(latest_instance.__class__, '__implicit_role_fields'):
cur_role = getattr(latest_instance, implicit_role_field.name, None)
if cur_role is None:
missing_roles.append(Role_(role_field=implicit_role_field.name, content_type_id=ct_id, object_id=latest_instance.id))
with batch_role_ancestor_rebuilding():
# Create any missing role objects
missing_roles = []
for implicit_role_field in getattr(latest_instance.__class__, '__implicit_role_fields'):
cur_role = getattr(latest_instance, implicit_role_field.name, None)
if cur_role is None:
missing_roles.append(Role_(role_field=implicit_role_field.name, content_type_id=ct_id, object_id=latest_instance.id))
if len(missing_roles) > 0:
Role_.objects.bulk_create(missing_roles)
updates = {}
role_ids = []
for role in Role_.objects.filter(content_type_id=ct_id, object_id=latest_instance.id):
setattr(latest_instance, role.role_field, role)
updates[role.role_field] = role.id
role_ids.append(role.id)
type(latest_instance).objects.filter(pk=latest_instance.pk).update(**updates)
if len(missing_roles) > 0:
Role_.objects.bulk_create(missing_roles)
updates = {}
role_ids = []
for role in Role_.objects.filter(content_type_id=ct_id, object_id=latest_instance.id):
setattr(latest_instance, role.role_field, role)
updates[role.role_field] = role.id
role_ids.append(role.id)
type(latest_instance).objects.filter(pk=latest_instance.pk).update(**updates)
Role.rebuild_role_ancestor_list(role_ids, [])
update_role_parentage_for_instance(latest_instance)
instance.refresh_from_db()
def _resolve_parent_roles(self, instance):
if not self.parent_role:
return set()
paths = self.parent_role if type(self.parent_role) is list else [self.parent_role]
parent_roles = set()
for path in paths:
if path.startswith("singleton:"):
singleton_name = path[10:]
Role_ = utils.get_current_apps().get_model('main', 'Role')
qs = Role_.objects.filter(singleton_name=singleton_name)
if qs.count() >= 1:
role = qs[0]
else:
role = Role_.objects.create(singleton_name=singleton_name, role_field=singleton_name)
parents = [role.id]
else:
parents = resolve_role_field(instance, path)
for parent in parents:
parent_roles.add(parent)
return parent_roles
def _post_delete(self, instance, *args, **kwargs):
role_ids = []
for implicit_role_field in getattr(instance.__class__, '__implicit_role_fields'):
role_ids.append(getattr(instance, implicit_role_field.name + '_id'))
Role_ = utils.get_current_apps().get_model('main', 'Role')
child_ids = [x for x in Role_.parents.through.objects.filter(to_role_id__in=role_ids).distinct().values_list('from_role_id', flat=True)]
Role_.objects.filter(id__in=role_ids).delete()
# Avoid circular import
from awx.main.models.rbac import Role
Role.rebuild_role_ancestor_list([], child_ids)
instance.refresh_from_db()
class SmartFilterField(models.TextField):
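The `_resolve_parent_roles` hunk above resolves `parent_role` paths, lazily creating singleton roles on first use. A minimal sketch of that control flow, with the Django ORM replaced by a plain dict (`ROLE_REGISTRY` and `_next_id` are illustrative stand-ins, not AWX names):

```python
ROLE_REGISTRY = {}  # singleton_name -> role id (stand-in for the Role table)
_next_id = [100]

def resolve_parent_roles(paths):
    """Collect parent role ids for a list of parent_role paths.

    Paths prefixed with 'singleton:' are looked up (or lazily created) in a
    global registry, mirroring the get-or-create in _resolve_parent_roles;
    anything else is treated as an already-resolved role id.
    """
    parent_roles = set()
    for path in paths:
        if isinstance(path, str) and path.startswith("singleton:"):
            name = path[len("singleton:"):]
            if name not in ROLE_REGISTRY:
                # mirrors Role_.objects.create(singleton_name=..., role_field=...)
                ROLE_REGISTRY[name] = _next_id[0]
                _next_id[0] += 1
            parent_roles.add(ROLE_REGISTRY[name])
        else:
            parent_roles.add(path)
    return parent_roles

assert resolve_parent_roles(["singleton:system_administrator"]) == {100}
# a second resolution reuses the existing singleton instead of creating another
assert resolve_parent_roles(["singleton:system_administrator", 7]) == {100, 7}
```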

View File

@@ -38,13 +38,13 @@ class Migration(migrations.Migration):
]
operations = [
migrations.RunPython(rename_jts, migrations.RunPython.noop),
migrations.RunPython(rename_projects, migrations.RunPython.noop),
migrations.AddField(
model_name='unifiedjobtemplate',
name='org_unique',
field=models.BooleanField(blank=True, default=True, editable=False, help_text='Used internally to selectively enforce database constraint on name'),
),
migrations.RunPython(rename_jts, migrations.RunPython.noop),
migrations.RunPython(rename_projects, migrations.RunPython.noop),
migrations.RunPython(rename_wfjt, migrations.RunPython.noop),
migrations.RunPython(change_inventory_source_org_unique, migrations.RunPython.noop),
migrations.AddConstraint(
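Why the reorder in this hunk matters: Django applies `operations` in list order, so a `RunPython` data migration that reads a column must come after the `AddField` that creates it; otherwise Postgres raises the `ProgrammingError` from the commit message. A toy ordering check, purely illustrative (`validate_ops` is not a Django API):

```python
def validate_ops(operations):
    """Walk (kind, fields) pairs in order; a RunPython that needs a field
    must come after the AddField that creates it."""
    created = set()
    for kind, fields in operations:
        if kind == 'AddField':
            created |= fields
        elif kind == 'RunPython' and fields - created:
            raise ValueError(f'RunPython uses missing fields: {fields - created}')

# the fixed ordering from this migration: field first, data migration second
validate_ops([('AddField', {'org_unique'}), ('RunPython', {'org_unique'})])

# the broken ordering raises, mirroring the ProgrammingError before the fix
try:
    validate_ops([('RunPython', {'org_unique'}), ('AddField', {'org_unique'})])
    raised = False
except ValueError:
    raised = True
assert raised
```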

View File

@@ -10,6 +10,11 @@ def setup_tower_managed_defaults(apps, schema_editor):
CredentialType.setup_tower_managed_defaults(apps)
def setup_rbac_role_system_administrator(apps, schema_editor):
Role = apps.get_model('main', 'Role')
Role.objects.get_or_create(singleton_name='system_administrator', role_field='system_administrator')
class Migration(migrations.Migration):
dependencies = [
('main', '0200_template_name_constraint'),
@@ -17,4 +22,5 @@ class Migration(migrations.Migration):
operations = [
migrations.RunPython(setup_tower_managed_defaults),
migrations.RunPython(setup_rbac_role_system_administrator),
]

View File

@@ -3,7 +3,6 @@ from time import time
from django.db.models import Subquery, OuterRef, F
from awx.main.fields import update_role_parentage_for_instance
from awx.main.models.rbac import Role, batch_role_ancestor_rebuilding
logger = logging.getLogger('rbac_migrations')
@@ -238,85 +237,10 @@ def restore_inventory_admins_backward(apps, schema_editor):
def rebuild_role_hierarchy(apps, schema_editor):
"""
This should be called in any migration when ownerships are changed.
Ex. I remove a user from the admin_role of a credential.
Ancestors are cached from parents for performance, this re-computes ancestors.
"""
logger.info('Computing role roots..')
start = time()
roots = Role.objects.all().values_list('id', flat=True)
stop = time()
logger.info('Found %d roots in %f seconds, rebuilding ancestry map' % (len(roots), stop - start))
start = time()
Role.rebuild_role_ancestor_list(roots, [])
stop = time()
logger.info('Rebuild ancestors completed in %f seconds' % (stop - start))
logger.info('Done.')
"""Not used after DAB RBAC migration"""
pass
def rebuild_role_parentage(apps, schema_editor, models=None):
"""
This should be called in any migration when any parent_role entry
is modified so that the cached parent fields will be updated. Ex:
foo_role = ImplicitRoleField(
parent_role=['bar_role'] # change to parent_role=['admin_role']
)
This is like rebuild_role_hierarchy, but that method updates ancestors,
whereas this method updates parents.
"""
start = time()
seen_models = set()
model_ct = 0
noop_ct = 0
ContentType = apps.get_model('contenttypes', "ContentType")
additions = set()
removals = set()
role_qs = Role.objects
if models:
# update_role_parentage_for_instance is expensive
# if the models have been downselected, ignore those which are not in the list
ct_ids = list(ContentType.objects.filter(model__in=[name.lower() for name in models]).values_list('id', flat=True))
role_qs = role_qs.filter(content_type__in=ct_ids)
for role in role_qs.iterator():
if not role.object_id:
continue
model_tuple = (role.content_type_id, role.object_id)
if model_tuple in seen_models:
continue
seen_models.add(model_tuple)
# The GenericForeignKey does not work right in migrations
# with the usage as role.content_object
# so we do the lookup ourselves with current migration models
ct = role.content_type
app = ct.app_label
ct_model = apps.get_model(app, ct.model)
content_object = ct_model.objects.get(pk=role.object_id)
parents_added, parents_removed = update_role_parentage_for_instance(content_object)
additions.update(parents_added)
removals.update(parents_removed)
if parents_added:
model_ct += 1
logger.debug('Added to parents of roles {} of {}'.format(parents_added, content_object))
if parents_removed:
model_ct += 1
logger.debug('Removed from parents of roles {} of {}'.format(parents_removed, content_object))
else:
noop_ct += 1
logger.debug('No changes to role parents for {} resources'.format(noop_ct))
logger.debug('Added parents to {} roles'.format(len(additions)))
logger.debug('Removed parents from {} roles'.format(len(removals)))
if model_ct:
logger.info('Updated implicit parents of {} resources'.format(model_ct))
logger.info('Rebuild parentage completed in %f seconds' % (time() - start))
# this is run because the ordinary signals for
# Role.parents.add and Role.parents.remove are not called in migrations
Role.rebuild_role_ancestor_list(list(additions), list(removals))
"""Not used after DAB RBAC migration"""
pass
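Before it was stubbed out, `rebuild_role_parentage` deduplicated `(content_type_id, object_id)` pairs so each resource was recomputed once, and aggregated parent additions/removals for a single ancestor rebuild at the end. A self-contained sketch of that bookkeeping (`update_parentage` stands in for `update_role_parentage_for_instance`):

```python
def rebuild_parentage(roles, update_parentage):
    """roles: iterable of (content_type_id, object_id) pairs."""
    seen, additions, removals = set(), set(), set()
    for ct_id, obj_id in roles:
        if not obj_id:
            continue  # singleton roles have no object; nothing to recompute
        key = (ct_id, obj_id)
        if key in seen:
            continue  # visit each resource once, not once per role row
        seen.add(key)
        added, removed = update_parentage(key)
        additions.update(added)
        removals.update(removed)
    return additions, removals

calls = []
def fake_update(key):
    calls.append(key)
    return {key[1]}, set()

adds, rems = rebuild_parentage([(1, 5), (1, 5), (1, None), (2, 8)], fake_update)
assert calls == [(1, 5), (2, 8)]  # duplicate and object-less roles skipped
assert adds == {5, 8} and rems == set()
```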

View File

@@ -38,7 +38,6 @@ from awx.main.models import (
InventorySource,
Job,
JobHostSummary,
JobTemplate,
Organization,
Project,
Role,
@@ -56,10 +55,7 @@ from awx.main.models import (
from awx.main.utils import model_instance_diff, model_to_dict, camelcase_to_underscore, get_current_apps
from awx.main.utils import ignore_inventory_computed_fields, ignore_inventory_group_removal, _inventory_updates
from awx.main.tasks.system import update_inventory_computed_fields, handle_removed_image
from awx.main.fields import (
is_implicit_parent,
update_role_parentage_for_instance,
)
from awx.main.fields import is_implicit_parent
from awx.main import consumers
@@ -192,31 +188,6 @@ def cleanup_detached_labels_on_deleted_parent(sender, instance, **kwargs):
label.delete()
def save_related_job_templates(sender, instance, **kwargs):
"""save_related_job_templates loops through all of the
job templates that use an Inventory that have had their
Organization updated. This triggers the rebuilding of the RBAC hierarchy
and ensures the proper access restrictions.
"""
if sender is not Inventory:
raise ValueError('This signal callback is only intended for use with Project or Inventory')
update_fields = kwargs.get('update_fields', None)
if (update_fields and not ('organization' in update_fields or 'organization_id' in update_fields)) or kwargs.get('created', False):
return
if instance._prior_values_store.get('organization_id') != instance.organization_id:
jtq = JobTemplate.objects.filter(**{sender.__name__.lower(): instance})
for jt in jtq:
parents_added, parents_removed = update_role_parentage_for_instance(jt)
if parents_added or parents_removed:
logger.info(
'Permissions on JT {} changed due to inventory {} organization change from {} to {}.'.format(
jt.pk, instance.pk, instance._prior_values_store.get('organization_id'), instance.organization_id
)
)
def connect_computed_field_signals():
post_save.connect(emit_update_inventory_on_created_or_deleted, sender=Host)
post_delete.connect(emit_update_inventory_on_created_or_deleted, sender=Host)
@@ -230,7 +201,6 @@ def connect_computed_field_signals():
connect_computed_field_signals()
post_save.connect(save_related_job_templates, sender=Inventory)
m2m_changed.connect(rebuild_role_ancestor_list, Role.parents.through)
m2m_changed.connect(rbac_activity_stream, Role.members.through)
m2m_changed.connect(rbac_activity_stream, Role.parents.through)
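The removed `save_related_job_templates` handler only did work when `organization_id` actually changed, skipping creates and saves whose `update_fields` did not touch the organization. That guard can be sketched as a pure function (`org_changed` is illustrative; `prior` stands in for `instance._prior_values_store`):

```python
def org_changed(prior, current_org_id, update_fields=None, created=False):
    """Return True only when an existing instance's organization changed."""
    if created:
        return False  # new objects have no prior organization to diff against
    if update_fields and not {'organization', 'organization_id'} & set(update_fields):
        return False  # targeted save that did not touch the organization
    return prior.get('organization_id') != current_org_id

assert org_changed({'organization_id': 1}, 2) is True
assert org_changed({'organization_id': 1}, 1) is False
assert org_changed({'organization_id': 1}, 2, update_fields={'name'}) is False
assert org_changed({'organization_id': 1}, 2, created=True) is False
```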

View File

@@ -1321,7 +1321,7 @@ class RunProjectUpdate(BaseTask):
galaxy_creds_are_defined = project_update.project.organization and project_update.project.organization.galaxy_credentials.exists()
if not galaxy_creds_are_defined and (settings.AWX_ROLES_ENABLED or settings.AWX_COLLECTIONS_ENABLED):
logger.warning('Galaxy role/collection syncing is enabled, but no credentials are configured for {project_update.project.organization}.')
logger.warning(f'Galaxy role/collection syncing is enabled, but no credentials are configured for {project_update.project.organization}.')
extra_vars.update(
{
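The one-character fix in this hunk: without the `f` prefix, the braces are emitted literally, so the warning never interpolated the organization. A minimal demonstration:

```python
org = "Default"  # illustrative value; the real code interpolates an Organization
plain = 'no credentials are configured for {org}.'
fixed = f'no credentials are configured for {org}.'

assert '{org}' in plain              # the bug: placeholder logged verbatim
assert fixed.endswith('Default.')    # the fix: value interpolated
```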

View File

@@ -49,7 +49,7 @@ def test_metrics_counts(organization_factory, job_template_factory, workflow_job
for gauge in gauges:
for sample in gauge.samples:
# name, label, value, timestamp, exemplar
name, _, value, _, _ = sample
name, _, value, _, _, _ = sample
assert EXPECTED_VALUES[name] == value
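Per the commit message, prometheus-client 0.22.0 returns an additional field per sample (upstream added a `native_histogram` slot to the `Sample` namedtuple, per its changelog — an assumption worth verifying against the pinned version). A sketch of unpacking that tolerates both shapes, rather than hard-coding the arity:

```python
def sample_value(sample):
    """Take only the leading (name, labels, value) fields, ignoring
    timestamp/exemplar/etc., so the tuple can grow in future releases."""
    name, _labels, value = sample[:3]
    return name, value

old_style = ('awx_jobs_total', {}, 3.0, None, None)        # 5 fields (< 0.22.0)
new_style = ('awx_jobs_total', {}, 3.0, None, None, None)  # 6 fields (>= 0.22.0)
assert sample_value(old_style) == sample_value(new_style) == ('awx_jobs_total', 3.0)
```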

View File

@@ -287,6 +287,72 @@ def test_sa_grant_private_credential_to_team_through_role_teams(post, credential
assert response.status_code == 400
@pytest.mark.django_db
def test_grant_credential_to_team_different_organization_through_role_teams(post, get, credential, organizations, admin, org_admin, team, team_member):
# Test that credential from different org can be assigned to team by a superuser through role_teams_list endpoint
orgs = organizations(2)
credential.organization = orgs[0]
credential.save()
team.organization = orgs[1]
team.save()
# Non-superuser (org_admin) trying cross-org assignment should be denied
response = post(reverse('api:role_teams_list', kwargs={'pk': credential.use_role.id}), {'id': team.id}, org_admin)
assert response.status_code == 400
assert (
"You cannot grant a team access to a credential in a different organization. Only superusers can grant cross-organization credential access to teams"
in response.data['msg']
)
# Superuser (admin) can do cross-org assignment
response = post(reverse('api:role_teams_list', kwargs={'pk': credential.use_role.id}), {'id': team.id}, admin)
assert response.status_code == 204
assert credential.use_role in team.member_role.children.all()
assert team_member in credential.read_role
assert team_member in credential.use_role
assert team_member not in credential.admin_role
@pytest.mark.django_db
def test_grant_credential_to_team_different_organization(post, get, credential, organizations, admin, org_admin, team, team_member):
# Test that credential from different org can be assigned to team by a superuser
orgs = organizations(2)
credential.organization = orgs[0]
credential.save()
team.organization = orgs[1]
team.save()
# Non-superuser (org_admin, ...) trying cross-org assignment should be denied
response = post(reverse('api:team_roles_list', kwargs={'pk': team.id}), {'id': credential.use_role.id}, org_admin)
assert response.status_code == 400
assert (
"You cannot grant a team access to a credential in a different organization. Only superusers can grant cross-organization credential access to teams"
in response.data['msg']
)
# Superuser (system admin) can do cross-org assignment
response = post(reverse('api:team_roles_list', kwargs={'pk': team.id}), {'id': credential.use_role.id}, admin)
assert response.status_code == 204
assert credential.use_role in team.member_role.children.all()
assert team_member in credential.read_role
assert team_member in credential.use_role
assert team_member not in credential.admin_role
# Team member can see the credential in API
response = get(reverse('api:team_credentials_list', kwargs={'pk': team.id}), team_member)
assert response.status_code == 200
assert response.data['count'] == 1
assert response.data['results'][0]['id'] == credential.id
# Team member can see the credential in general credentials API
response = get(reverse('api:credential_list'), team_member)
assert response.status_code == 200
assert any(cred['id'] == credential.id for cred in response.data['results'])
@pytest.mark.django_db
def test_sa_grant_private_credential_to_team_through_team_roles(post, credential, admin, team):
# not even a system admin can grant a private cred to a team though

View File

@@ -0,0 +1,244 @@
from unittest.mock import patch, MagicMock
import pytest
from awx.api.versioning import reverse
from rest_framework import status
@pytest.mark.django_db
class TestApiV2SubscriptionView:
"""Test cases for the /api/v2/config/subscriptions/ endpoint"""
def test_basic_auth(self, post, admin):
"""Test POST with subscriptions_username and subscriptions_password calls validate_rh with basic_auth=True"""
data = {'subscriptions_username': 'test_user', 'subscriptions_password': 'test_password'}
with patch('awx.api.views.root.get_licenser') as mock_get_licenser:
mock_licenser = MagicMock()
mock_licenser.validate_rh.return_value = []
mock_get_licenser.return_value = mock_licenser
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_200_OK
mock_licenser.validate_rh.assert_called_once_with('test_user', 'test_password', True)
def test_service_account(self, post, admin):
"""Test POST with subscriptions_client_id and subscriptions_client_secret calls validate_rh with basic_auth=False"""
data = {'subscriptions_client_id': 'test_client_id', 'subscriptions_client_secret': 'test_client_secret'}
with patch('awx.api.views.root.get_licenser') as mock_get_licenser:
mock_licenser = MagicMock()
mock_licenser.validate_rh.return_value = []
mock_get_licenser.return_value = mock_licenser
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_200_OK
mock_licenser.validate_rh.assert_called_once_with('test_client_id', 'test_client_secret', False)
def test_encrypted_password_basic_auth(self, post, admin, settings):
"""Test POST with $encrypted$ password uses settings value for basic auth"""
data = {'subscriptions_username': 'test_user', 'subscriptions_password': '$encrypted$'}
settings.SUBSCRIPTIONS_PASSWORD = 'actual_password_from_settings'
with patch('awx.api.views.root.get_licenser') as mock_get_licenser:
mock_licenser = MagicMock()
mock_licenser.validate_rh.return_value = []
mock_get_licenser.return_value = mock_licenser
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_200_OK
mock_licenser.validate_rh.assert_called_once_with('test_user', 'actual_password_from_settings', True)
def test_encrypted_client_secret_service_account(self, post, admin, settings):
"""Test POST with $encrypted$ client_secret uses settings value for service_account"""
data = {'subscriptions_client_id': 'test_client_id', 'subscriptions_client_secret': '$encrypted$'}
settings.SUBSCRIPTIONS_CLIENT_SECRET = 'actual_secret_from_settings'
with patch('awx.api.views.root.get_licenser') as mock_get_licenser:
mock_licenser = MagicMock()
mock_licenser.validate_rh.return_value = []
mock_get_licenser.return_value = mock_licenser
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_200_OK
mock_licenser.validate_rh.assert_called_once_with('test_client_id', 'actual_secret_from_settings', False)
def test_missing_username_returns_error(self, post, admin):
"""Test POST with missing username returns 400 error"""
data = {'subscriptions_password': 'test_password'}
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert 'Missing subscription credentials' in response.data['error']
def test_missing_password_returns_error(self, post, admin, settings):
"""Test POST with missing password returns 400 error"""
data = {'subscriptions_username': 'test_user'}
settings.SUBSCRIPTIONS_PASSWORD = None
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert 'Missing subscription credentials' in response.data['error']
def test_missing_client_id_returns_error(self, post, admin):
"""Test POST with missing client_id returns 400 error"""
data = {'subscriptions_client_secret': 'test_secret'}
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert 'Missing subscription credentials' in response.data['error']
def test_missing_client_secret_returns_error(self, post, admin, settings):
"""Test POST with missing client_secret returns 400 error"""
data = {'subscriptions_client_id': 'test_client_id'}
settings.SUBSCRIPTIONS_CLIENT_SECRET = None
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert 'Missing subscription credentials' in response.data['error']
def test_empty_username_returns_error(self, post, admin):
"""Test POST with empty username returns 400 error"""
data = {'subscriptions_username': '', 'subscriptions_password': 'test_password'}
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert 'Missing subscription credentials' in response.data['error']
def test_empty_password_returns_error(self, post, admin, settings):
"""Test POST with empty password returns 400 error"""
data = {'subscriptions_username': 'test_user', 'subscriptions_password': ''}
settings.SUBSCRIPTIONS_PASSWORD = None
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert 'Missing subscription credentials' in response.data['error']
def test_non_superuser_permission_denied(self, post, rando):
"""Test that non-superuser cannot access the endpoint"""
data = {'subscriptions_username': 'test_user', 'subscriptions_password': 'test_password'}
response = post(reverse('api:api_v2_subscription_view'), data, rando)
assert response.status_code == status.HTTP_403_FORBIDDEN
def test_settings_updated_on_successful_basic_auth(self, post, admin, settings):
"""Test that settings are updated when basic auth validation succeeds"""
data = {'subscriptions_username': 'new_username', 'subscriptions_password': 'new_password'}
with patch('awx.api.views.root.get_licenser') as mock_get_licenser:
mock_licenser = MagicMock()
mock_licenser.validate_rh.return_value = []
mock_get_licenser.return_value = mock_licenser
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_200_OK
assert settings.SUBSCRIPTIONS_USERNAME == 'new_username'
assert settings.SUBSCRIPTIONS_PASSWORD == 'new_password'
def test_settings_updated_on_successful_service_account(self, post, admin, settings):
"""Test that settings are updated when service account validation succeeds"""
data = {'subscriptions_client_id': 'new_client_id', 'subscriptions_client_secret': 'new_client_secret'}
with patch('awx.api.views.root.get_licenser') as mock_get_licenser:
mock_licenser = MagicMock()
mock_licenser.validate_rh.return_value = []
mock_get_licenser.return_value = mock_licenser
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_200_OK
assert settings.SUBSCRIPTIONS_CLIENT_ID == 'new_client_id'
assert settings.SUBSCRIPTIONS_CLIENT_SECRET == 'new_client_secret'
def test_validate_rh_exception_handling(self, post, admin):
"""Test that exceptions from validate_rh are properly handled"""
data = {'subscriptions_username': 'test_user', 'subscriptions_password': 'test_password'}
with patch('awx.api.views.root.get_licenser') as mock_get_licenser:
mock_licenser = MagicMock()
mock_licenser.validate_rh.side_effect = Exception("Connection error")
mock_get_licenser.return_value = mock_licenser
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_400_BAD_REQUEST
def test_mixed_credentials_prioritizes_client_id(self, post, admin):
"""Test that when both username and client_id are provided, client_id takes precedence"""
data = {
'subscriptions_username': 'test_user',
'subscriptions_password': 'test_password',
'subscriptions_client_id': 'test_client_id',
'subscriptions_client_secret': 'test_client_secret',
}
with patch('awx.api.views.root.get_licenser') as mock_get_licenser:
mock_licenser = MagicMock()
mock_licenser.validate_rh.return_value = []
mock_get_licenser.return_value = mock_licenser
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_200_OK
# Should use service account (basic_auth=False) since client_id is present
mock_licenser.validate_rh.assert_called_once_with('test_client_id', 'test_client_secret', False)
def test_basic_auth_clears_service_account_settings(self, post, admin, settings):
"""Test that setting basic auth credentials clears service account settings"""
# Pre-populate service account settings
settings.SUBSCRIPTIONS_CLIENT_ID = 'existing_client_id'
settings.SUBSCRIPTIONS_CLIENT_SECRET = 'existing_client_secret'
data = {'subscriptions_username': 'test_user', 'subscriptions_password': 'test_password'}
with patch('awx.api.views.root.get_licenser') as mock_get_licenser:
mock_licenser = MagicMock()
mock_licenser.validate_rh.return_value = []
mock_get_licenser.return_value = mock_licenser
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_200_OK
# Basic auth settings should be set
assert settings.SUBSCRIPTIONS_USERNAME == 'test_user'
assert settings.SUBSCRIPTIONS_PASSWORD == 'test_password'
# Service account settings should be cleared
assert settings.SUBSCRIPTIONS_CLIENT_ID == ""
assert settings.SUBSCRIPTIONS_CLIENT_SECRET == ""
def test_service_account_clears_basic_auth_settings(self, post, admin, settings):
"""Test that setting service account credentials clears basic auth settings"""
# Pre-populate basic auth settings
settings.SUBSCRIPTIONS_USERNAME = 'existing_username'
settings.SUBSCRIPTIONS_PASSWORD = 'existing_password'
data = {'subscriptions_client_id': 'test_client_id', 'subscriptions_client_secret': 'test_client_secret'}
with patch('awx.api.views.root.get_licenser') as mock_get_licenser:
mock_licenser = MagicMock()
mock_licenser.validate_rh.return_value = []
mock_get_licenser.return_value = mock_licenser
response = post(reverse('api:api_v2_subscription_view'), data, admin)
assert response.status_code == status.HTTP_200_OK
# Service account settings should be set
assert settings.SUBSCRIPTIONS_CLIENT_ID == 'test_client_id'
assert settings.SUBSCRIPTIONS_CLIENT_SECRET == 'test_client_secret'
# Basic auth settings should be cleared
assert settings.SUBSCRIPTIONS_USERNAME == ""
assert settings.SUBSCRIPTIONS_PASSWORD == ""
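The behavior these tests encode: service-account credentials take precedence when both pairs are supplied, and setting one mode clears the other's settings. A sketch of that selection logic under those assumptions (`apply_subscription_creds` is hypothetical; a plain dict stands in for Django settings):

```python
def apply_subscription_creds(data, settings):
    """Store one credential pair and blank the other, returning the mode used."""
    if data.get('subscriptions_client_id'):
        settings.update(
            SUBSCRIPTIONS_CLIENT_ID=data['subscriptions_client_id'],
            SUBSCRIPTIONS_CLIENT_SECRET=data['subscriptions_client_secret'],
            SUBSCRIPTIONS_USERNAME="",
            SUBSCRIPTIONS_PASSWORD="",
        )
        return 'service_account'
    settings.update(
        SUBSCRIPTIONS_USERNAME=data['subscriptions_username'],
        SUBSCRIPTIONS_PASSWORD=data['subscriptions_password'],
        SUBSCRIPTIONS_CLIENT_ID="",
        SUBSCRIPTIONS_CLIENT_SECRET="",
    )
    return 'basic_auth'

s = {}
mode = apply_subscription_creds(
    {'subscriptions_username': 'u', 'subscriptions_password': 'p',
     'subscriptions_client_id': 'cid', 'subscriptions_client_secret': 'sec'}, s)
assert mode == 'service_account'      # client_id wins when both are present
assert s['SUBSCRIPTIONS_USERNAME'] == ""  # basic auth settings cleared
```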

View File

@@ -387,36 +387,6 @@ def test_remove_team_from_role(post, team, admin, role):
assert role.parents.filter(id=team.member_role.id).count() == 0
#
# /roles/<id>/parents/
#
@pytest.mark.django_db
def test_role_parents(get, team, admin, role):
role.parents.add(team.member_role)
url = reverse('api:role_parents_list', kwargs={'pk': role.id})
response = get(url, admin)
assert response.status_code == 200
assert response.data['count'] == 1
assert response.data['results'][0]['id'] == team.member_role.id
#
# /roles/<id>/children/
#
@pytest.mark.django_db
def test_role_children(get, team, admin, role):
role.parents.add(team.member_role)
url = reverse('api:role_children_list', kwargs={'pk': team.member_role.id})
response = get(url, admin)
assert response.status_code == 200
assert response.data['count'] == 2
assert response.data['results'][0]['id'] == role.id or response.data['results'][1]['id'] == role.id
#
# Generics
#

View File

@@ -167,3 +167,9 @@ class TestMigrationSmoke:
assert CredentialType.objects.filter(
name=expected_name
).exists(), f'Could not find {expected_name} credential type name, all names: {list(CredentialType.objects.values_list("name", flat=True))}'
# Verify the system_administrator role exists
Role = new_state.apps.get_model('main', 'Role')
assert Role.objects.filter(
singleton_name='system_administrator', role_field='system_administrator'
).exists(), "expected to find a system_administrator singleton role"

View File

@@ -1,37 +0,0 @@
import json
from http import HTTPStatus
from unittest.mock import patch
from requests import Response
from awx.main.utils.licensing import Licenser
def test_rhsm_licensing():
def mocked_requests_get(*args, **kwargs):
assert kwargs['verify'] == True
response = Response()
subs = json.dumps({'body': []})
response.status_code = HTTPStatus.OK
response._content = bytes(subs, 'utf-8')
return response
licenser = Licenser()
with patch('awx.main.utils.analytics_proxy.OIDCClient.make_request', new=mocked_requests_get):
subs = licenser.get_rhsm_subs('localhost', 'admin', 'admin')
assert subs == []
def test_satellite_licensing():
def mocked_requests_get(*args, **kwargs):
assert kwargs['verify'] == True
response = Response()
subs = json.dumps({'results': []})
response.status_code = HTTPStatus.OK
response._content = bytes(subs, 'utf-8')
return response
licenser = Licenser()
with patch('requests.get', new=mocked_requests_get):
subs = licenser.get_satellite_subs('localhost', 'admin', 'admin')
assert subs == []

View File

@@ -0,0 +1,154 @@
from unittest.mock import patch
from awx.main.utils.licensing import Licenser
def test_validate_rh_basic_auth_rhsm():
"""
Assert get_rhsm_subs is called when
- basic_auth=True
- host is subscription.rhsm.redhat.com
"""
licenser = Licenser()
with patch.object(licenser, 'get_host_from_rhsm_config', return_value='https://subscription.rhsm.redhat.com') as mock_get_host, patch.object(
licenser, 'get_rhsm_subs', return_value=[]
) as mock_get_rhsm, patch.object(licenser, 'get_satellite_subs') as mock_get_satellite, patch.object(
licenser, 'get_crc_subs'
) as mock_get_crc, patch.object(
licenser, 'generate_license_options_from_entitlements'
) as mock_generate:
licenser.validate_rh('testuser', 'testpass', basic_auth=True)
# Assert the correct methods were called
mock_get_host.assert_called_once()
mock_get_rhsm.assert_called_once_with('https://subscription.rhsm.redhat.com', 'testuser', 'testpass')
mock_get_satellite.assert_not_called()
mock_get_crc.assert_not_called()
mock_generate.assert_called_once_with([], is_candlepin=True)
def test_validate_rh_basic_auth_satellite():
"""
Assert get_satellite_subs is called when
- basic_auth=True
- custom satellite host
"""
licenser = Licenser()
with patch.object(licenser, 'get_host_from_rhsm_config', return_value='https://satellite.example.com') as mock_get_host, patch.object(
licenser, 'get_rhsm_subs'
) as mock_get_rhsm, patch.object(licenser, 'get_satellite_subs', return_value=[]) as mock_get_satellite, patch.object(
licenser, 'get_crc_subs'
) as mock_get_crc, patch.object(
licenser, 'generate_license_options_from_entitlements'
) as mock_generate:
licenser.validate_rh('testuser', 'testpass', basic_auth=True)
# Assert the correct methods were called
mock_get_host.assert_called_once()
mock_get_rhsm.assert_not_called()
mock_get_satellite.assert_called_once_with('https://satellite.example.com', 'testuser', 'testpass')
mock_get_crc.assert_not_called()
mock_generate.assert_called_once_with([], is_candlepin=True)
def test_validate_rh_service_account_crc():
"""
Assert get_crc_subs is called when
- basic_auth=False
"""
licenser = Licenser()
with patch('awx.main.utils.licensing.settings') as mock_settings, patch.object(licenser, 'get_host_from_rhsm_config') as mock_get_host, patch.object(
licenser, 'get_rhsm_subs'
) as mock_get_rhsm, patch.object(licenser, 'get_satellite_subs') as mock_get_satellite, patch.object(
licenser, 'get_crc_subs', return_value=[]
) as mock_get_crc, patch.object(
licenser, 'generate_license_options_from_entitlements'
) as mock_generate:
mock_settings.SUBSCRIPTIONS_RHSM_URL = 'https://console.redhat.com/api/rhsm/v1/subscriptions'
licenser.validate_rh('client_id', 'client_secret', basic_auth=False)
# Assert the correct methods were called
mock_get_host.assert_not_called()
mock_get_rhsm.assert_not_called()
mock_get_satellite.assert_not_called()
mock_get_crc.assert_called_once_with('https://console.redhat.com/api/rhsm/v1/subscriptions', 'client_id', 'client_secret')
mock_generate.assert_called_once_with([], is_candlepin=False)
def test_validate_rh_missing_user_raises_error():
"""Test validate_rh raises ValueError when user is missing"""
licenser = Licenser()
with patch.object(licenser, 'get_host_from_rhsm_config', return_value='https://subscription.rhsm.redhat.com'):
try:
licenser.validate_rh(None, 'testpass', basic_auth=True)
assert False, "Expected ValueError to be raised"
except ValueError as e:
assert 'subscriptions_client_id or subscriptions_username is required' in str(e)
def test_validate_rh_missing_password_raises_error():
"""Test validate_rh raises ValueError when password is missing"""
licenser = Licenser()
with patch.object(licenser, 'get_host_from_rhsm_config', return_value='https://subscription.rhsm.redhat.com'):
try:
licenser.validate_rh('testuser', None, basic_auth=True)
assert False, "Expected ValueError to be raised"
except ValueError as e:
assert 'subscriptions_client_secret or subscriptions_password is required' in str(e)
def test_validate_rh_no_host_fallback_to_candlepin():
"""Test validate_rh falls back to REDHAT_CANDLEPIN_HOST when no host from config
- basic_auth=True
- no host from config
- REDHAT_CANDLEPIN_HOST is set
"""
licenser = Licenser()
with patch('awx.main.utils.licensing.settings') as mock_settings, patch.object(
licenser, 'get_host_from_rhsm_config', return_value=None
) as mock_get_host, patch.object(licenser, 'get_rhsm_subs', return_value=[]) as mock_get_rhsm, patch.object(
licenser, 'get_satellite_subs', return_value=[]
) as mock_get_satellite, patch.object(
licenser, 'get_crc_subs'
) as mock_get_crc, patch.object(
licenser, 'generate_license_options_from_entitlements'
) as mock_generate:
mock_settings.REDHAT_CANDLEPIN_HOST = 'https://candlepin.example.com'
licenser.validate_rh('testuser', 'testpass', basic_auth=True)
# Assert the correct methods were called
mock_get_host.assert_called_once()
mock_get_rhsm.assert_not_called()
mock_get_satellite.assert_called_once_with('https://candlepin.example.com', 'testuser', 'testpass')
mock_get_crc.assert_not_called()
mock_generate.assert_called_once_with([], is_candlepin=True)
def test_validate_rh_empty_credentials_basic_auth():
"""Test validate_rh with empty string credentials raises ValueError"""
licenser = Licenser()
with patch.object(licenser, 'get_host_from_rhsm_config', return_value='https://subscription.rhsm.redhat.com'):
# Test empty user
try:
licenser.validate_rh(None, 'testpass', basic_auth=True)
assert False, "Expected ValueError to be raised"
except ValueError as e:
assert 'subscriptions_client_id or subscriptions_username is required' in str(e)
# Test empty password
try:
licenser.validate_rh('testuser', None, basic_auth=True)
assert False, "Expected ValueError to be raised"
except ValueError as e:
assert 'subscriptions_client_secret or subscriptions_password is required' in str(e)

View File

@@ -219,30 +219,65 @@ class Licenser(object):
kwargs['license_date'] = int(kwargs['license_date'])
self._attrs.update(kwargs)
def validate_rh(self, user, pw):
def get_host_from_rhsm_config(self):
try:
host = 'https://' + str(self.config.get("server", "hostname"))
except Exception:
logger.exception('Cannot access rhsm.conf, make sure subscription manager is installed and configured.')
host = None
return host
def validate_rh(self, user, pw, basic_auth):
# if basic auth is True, host is read from rhsm.conf (subscription.rhsm.redhat.com)
# if basic auth is False, host is settings.SUBSCRIPTIONS_RHSM_URL (console.redhat.com)
# if rhsm.conf is not found, host is settings.REDHAT_CANDLEPIN_HOST (satellite server)
if basic_auth:
host = self.get_host_from_rhsm_config()
if not host:
host = getattr(settings, 'REDHAT_CANDLEPIN_HOST', None)
else:
host = settings.SUBSCRIPTIONS_RHSM_URL
if not host:
host = getattr(settings, 'REDHAT_CANDLEPIN_HOST', None)
raise ValueError('Could not get host url for subscriptions')
if not user:
raise ValueError('subscriptions_client_id is required')
raise ValueError('subscriptions_client_id or subscriptions_username is required')
if not pw:
raise ValueError('subscriptions_client_secret is required')
raise ValueError('subscriptions_client_secret or subscriptions_password is required')
if host and user and pw:
if 'subscription.rhsm.redhat.com' in host:
json = self.get_rhsm_subs(settings.SUBSCRIPTIONS_RHSM_URL, user, pw)
if basic_auth:
if 'subscription.rhsm.redhat.com' in host:
json = self.get_rhsm_subs(host, user, pw)
else:
json = self.get_satellite_subs(host, user, pw)
else:
json = self.get_satellite_subs(host, user, pw)
return self.generate_license_options_from_entitlements(json)
json = self.get_crc_subs(host, user, pw)
return self.generate_license_options_from_entitlements(json, is_candlepin=basic_auth)
return []
def get_rhsm_subs(self, host, client_id, client_secret):
def get_rhsm_subs(self, host, user, pw):
verify = getattr(settings, 'REDHAT_CANDLEPIN_VERIFY', True)
json = []
try:
subs = requests.get('/'.join([host, 'subscription/users/{}/owners'.format(user)]), verify=verify, auth=(user, pw))
except requests.exceptions.ConnectionError as error:
raise error
except OSError as error:
raise OSError(
'Unable to open certificate bundle {}. Check that the service is running on Red Hat Enterprise Linux.'.format(verify)
) from error # noqa
subs.raise_for_status()
for sub in subs.json():
resp = requests.get('/'.join([host, 'subscription/owners/{}/pools/?match=*tower*'.format(sub['key'])]), verify=verify, auth=(user, pw))
resp.raise_for_status()
json.extend(resp.json())
return json
def get_crc_subs(self, host, client_id, client_secret):
try:
client = OIDCClient(client_id, client_secret)
subs = client.make_request(
@@ -320,12 +355,21 @@ class Licenser(object):
json.append(license)
return json
def is_appropriate_sub(self, sub):
if sub['activeSubscription'] is False:
return False
# Products that contain Ansible Tower
products = sub.get('providedProducts', [])
if any(product.get('productId') == '480' for product in products):
return True
return False
def is_appropriate_sat_sub(self, sub):
if 'Red Hat Ansible Automation' not in sub['subscription_name']:
return False
return True
def generate_license_options_from_entitlements(self, json):
def generate_license_options_from_entitlements(self, json, is_candlepin=False):
from dateutil.parser import parse
ValidSub = collections.namedtuple(
@@ -336,12 +380,14 @@ class Licenser(object):
satellite = sub.get('satellite')
if satellite:
is_valid = self.is_appropriate_sat_sub(sub)
elif is_candlepin:
is_valid = self.is_appropriate_sub(sub)
else:
# the list of subs from console.redhat.com are already valid based on the query params we provided
# the list of subs from console.redhat.com and subscriptions.rhsm.redhat.com are already valid based on the query params we provided
is_valid = True
if is_valid:
try:
if satellite:
if is_candlepin:
end_date = parse(sub.get('endDate'))
else:
end_date = parse(sub['subscriptions']['endDate'])
@@ -354,10 +400,10 @@ class Licenser(object):
continue
developer_license = False
support_level = ''
support_level = sub.get('support_level', '')
account_number = ''
usage = sub.get('usage', '')
if satellite:
if is_candlepin:
try:
quantity = int(sub['quantity'])
except Exception:
@@ -365,7 +411,6 @@ class Licenser(object):
sku = sub['productId']
subscription_id = sub['subscriptionId']
sub_name = sub['productName']
support_level = sub['support_level']
account_number = sub['accountNumber']
else:
try:
@@ -434,6 +479,8 @@ class Licenser(object):
license.update(subscription_id=sub.subscription_id)
license.update(account_number=sub.account_number)
licenses.append(license._attrs.copy())
# sort by sku
licenses.sort(key=lambda x: x['sku'])
return licenses
raise ValueError('No valid Red Hat Ansible Automation subscription could be found for this account.') # noqa


@@ -19,18 +19,27 @@ short_description: Get subscription list
description:
- Get subscriptions available to Automation Platform Controller. See
U(https://www.ansible.com/tower) for an overview.
- The credentials you use will be stored for future use in retrieving renewal or expanded subscriptions
options:
username:
description:
- Red Hat username to get available subscriptions.
required: False
type: str
password:
description:
- Red Hat password to get available subscriptions.
required: False
type: str
client_id:
description:
- Red Hat service account client ID or Red Hat Satellite username to get available subscriptions.
- The credentials you use will be stored for future use in retrieving renewal or expanded subscriptions
required: True
- Red Hat service account client ID to get available subscriptions.
required: False
type: str
client_secret:
description:
- Red Hat service account client secret or Red Hat Satellite password to get available subscriptions.
- The credentials you use will be stored for future use in retrieving renewal or expanded subscriptions
required: True
- Red Hat service account client secret to get available subscriptions.
required: False
type: str
filters:
description:
@@ -72,19 +81,41 @@ def main():
module = ControllerAPIModule(
argument_spec=dict(
client_id=dict(type='str', required=True),
client_secret=dict(type='str', no_log=True, required=True),
username=dict(type='str', required=False),
password=dict(type='str', no_log=True, required=False),
client_id=dict(type='str', required=False),
client_secret=dict(type='str', no_log=True, required=False),
filters=dict(type='dict', required=False, default={}),
),
mutually_exclusive=[
['username', 'client_id']
],
required_together=[
['username', 'password'],
['client_id', 'client_secret']
],
required_one_of=[
['username', 'client_id']
],
)
json_output = {'changed': False}
username = module.params.get('username')
password = module.params.get('password')
client_id = module.params.get('client_id')
client_secret = module.params.get('client_secret')
if username and password:
post_data = {
'subscriptions_username': username,
'subscriptions_password': password,
}
else:
post_data = {
'subscriptions_client_id': client_id,
'subscriptions_client_secret': client_secret,
}
# Check if Tower is already licensed
post_data = {
'subscriptions_client_secret': module.params.get('client_secret'),
'subscriptions_client_id': module.params.get('client_id'),
}
all_subscriptions = module.post_endpoint('config/subscriptions', data=post_data)['json']
json_output['subscriptions'] = []
for subscription in all_subscriptions:


@@ -82,7 +82,38 @@ class CLI(object):
return '--help' in self.argv or '-h' in self.argv
def authenticate(self):
"""Configure the current session for basic auth"""
"""Configure the current session for authentication.
Uses Basic authentication when AWXKIT_FORCE_BASIC_AUTH environment variable
is set to true, otherwise defaults to session-based authentication.
For AAP Gateway environments, set AWXKIT_FORCE_BASIC_AUTH=true to bypass
session login restrictions.
"""
# Check if Basic auth is forced via environment variable
if config.get('force_basic_auth', False):
config.use_sessions = False
# Validate credentials are provided
username = self.get_config('username')
password = self.get_config('password')
if not username or not password:
raise ValueError(
"Basic authentication requires both username and password. "
"Provide --conf.username and --conf.password or set "
"CONTROLLER_USERNAME and CONTROLLER_PASSWORD environment variables."
)
# Apply Basic auth credentials to the session
try:
self.root.connection.login(username, password)
self.root.get()
except Exception as e:
raise RuntimeError(f"Basic authentication failed: {str(e)}. " "Verify credentials and network connectivity.") from e
return
# Use session-based authentication (default)
config.use_sessions = True
self.root.load_session().get()


@@ -32,6 +32,7 @@ config.assume_untrusted = config.get('assume_untrusted', True)
config.client_connection_attempts = int(os.getenv('AWXKIT_CLIENT_CONNECTION_ATTEMPTS', 5))
config.prevent_teardown = to_bool(os.getenv('AWXKIT_PREVENT_TEARDOWN', False))
config.use_sessions = to_bool(os.getenv('AWXKIT_SESSIONS', False))
config.force_basic_auth = to_bool(os.getenv('AWXKIT_FORCE_BASIC_AUTH', False))
config.api_base_path = os.getenv('CONTROLLER_OPTIONAL_API_URLPATTERN_PREFIX', '/api/')
config.api_base_path = os.getenv('AWXKIT_API_BASE_PATH', config.api_base_path)
config.gateway_base_path = os.getenv('AWXKIT_GATEWAY_BASE_PATH', '/api/gateway/')


@@ -0,0 +1,103 @@
import pytest
from typing import Tuple, List, Optional
from unittest.mock import Mock
from awxkit.cli import CLI
from awxkit import config
@pytest.fixture(autouse=True)
def reset_config_state(monkeypatch: pytest.MonkeyPatch) -> None:
"""Ensure clean config state for each test to prevent parallel test interference"""
monkeypatch.setattr(config, 'force_basic_auth', False, raising=False)
monkeypatch.setattr(config, 'use_sessions', False, raising=False)
# ============================================================================
# Test Helper Functions
# ============================================================================
def setup_basic_auth(cli_args: Optional[List[str]] = None) -> Tuple[CLI, Mock, Mock]:
"""Set up CLI with mocked connection for Basic auth testing"""
cli = CLI()
cli.parse_args(cli_args or ['awx', '--conf.username', 'testuser', '--conf.password', 'testpass'])
mock_root = Mock()
mock_connection = Mock()
mock_root.connection = mock_connection
cli.root = mock_root
return cli, mock_root, mock_connection
def setup_session_auth(cli_args: Optional[List[str]] = None) -> Tuple[CLI, Mock, Mock]:
"""Set up CLI with mocked session for Session auth testing"""
cli = CLI()
cli.parse_args(cli_args or ['awx', '--conf.username', 'testuser', '--conf.password', 'testpass'])
mock_root = Mock()
mock_load_session = Mock()
mock_root.load_session.return_value = mock_load_session
cli.root = mock_root
return cli, mock_root, mock_load_session
def test_basic_auth_enabled(monkeypatch):
"""Test that AWXKIT_FORCE_BASIC_AUTH=true enables Basic authentication"""
cli, mock_root, mock_connection = setup_basic_auth()
monkeypatch.setattr(config, 'force_basic_auth', True)
cli.authenticate()
mock_connection.login.assert_called_once_with('testuser', 'testpass')
mock_root.get.assert_called_once()
assert not config.use_sessions
def test_session_auth_default(monkeypatch):
"""Test that session auth is used by default (backward compatibility)"""
cli, mock_root, mock_load_session = setup_session_auth()
monkeypatch.setattr(config, 'force_basic_auth', False)
cli.authenticate()
mock_root.load_session.assert_called_once()
mock_load_session.get.assert_called_once()
assert config.use_sessions
def test_aap_gateway_scenario(monkeypatch):
"""Test the specific AAP Gateway scenario from AAP-46830"""
cli, mock_root, mock_connection = setup_basic_auth(
['awx', '--conf.host', 'https://aap-sbx.cambiahealth.com', '--conf.username', 'puretest', '--conf.password', 'testpass']
)
monkeypatch.setattr(config, 'force_basic_auth', True)
cli.authenticate()
mock_connection.login.assert_called_once_with('puretest', 'testpass')
mock_root.get.assert_called_once()
assert not config.use_sessions
def test_empty_credentials_error(monkeypatch):
"""Test error handling for explicitly empty credentials"""
cli, mock_root, mock_connection = setup_basic_auth(['awx', '--conf.username', '', '--conf.password', ''])
monkeypatch.setattr(config, 'force_basic_auth', True)
with pytest.raises(ValueError, match="Basic authentication requires both username and password"):
cli.authenticate()
mock_connection.login.assert_not_called()
def test_connection_failure(monkeypatch):
"""Test error handling when Basic auth connection fails"""
cli, mock_root, mock_connection = setup_basic_auth()
mock_connection.login.side_effect = Exception("Connection failed")
monkeypatch.setattr(config, 'force_basic_auth', True)
with pytest.raises(RuntimeError, match="Basic authentication failed: Connection failed"):
cli.authenticate()
mock_connection.login.assert_called_once_with('testuser', 'testpass')
assert not config.use_sessions

File diff suppressed because it is too large


@@ -0,0 +1,725 @@
=====================================================
Django Development Requirements
=====================================================
**AWX Codebase Best Practices**
:Version: 1.0
:Date: September 2025
:Based on: AWX Enterprise Django Application Analysis
:Generated by: Claude Code AI
----
.. contents:: Table of Contents
:depth: 3
:local:
----
1. Project Structure
====================
1.1 Modular Application Architecture
------------------------------------
**REQUIRED**: Organize Django project with clear separation of concerns::
awx/
├── __init__.py # Version management and environment detection
├── main/ # Core business logic and models
├── api/ # REST API layer (Django REST Framework)
├── ui/ # Frontend integration
├── conf/ # Configuration management
├── settings/ # Environment-specific settings
├── templates/ # Django templates
└── static/ # Static assets
**Requirements**:
- Each functional area must have its own Django app
- Use descriptive app names that reflect business domains
- Separate API logic from core business logic
1.2 Pre-Management Command Code
--------------------------------
This section describes the code that runs before every management command.
AWX persistent services (e.g. wsrelay, heartbeat, dispatcher) all have management commands as entry points. So if you want to write a new persistent service, make a management command.
System jobs are implemented as management commands too.
**REQUIRED**: Implement custom Django management integration:
.. code-block:: python
# awx/__init__.py
def manage():
"""Custom management function with environment preparation"""
prepare_env()
from django.core.management import execute_from_command_line
# Version validation for production
if not MODE == 'development':
validate_production_requirements()
execute_from_command_line(sys.argv)
**Requirements**:
- Environment detection (development/production modes)
- Production deployment validation
- Custom version checking mechanisms
- Database version compatibility checks
----
2. Settings Management
======================
2.1 Environment-Based Settings Architecture
-------------------------------------------
**REQUIRED**: Use ``django-split-settings`` for modular configuration::
# settings/defaults.py - Base configuration
# settings/development.py - Development overrides
# settings/production.py - Production security settings
# settings/testing.py - Test-specific configuration
**Settings Pattern**:
.. code-block:: python
# development.py
from .defaults import *
from split_settings.tools import optional, include
DEBUG = True
ALLOWED_HOSTS = ['*']
# Include optional local settings
include(optional('local_settings.py'))
2.2 Sourcing config from files
-------------------------------
**REQUIRED**: Sourcing config from multiple files (in a directory) on disk:
.. code-block:: python
# External settings loading
EXTERNAL_SETTINGS = os.environ.get('AWX_SETTINGS_FILE')
if EXTERNAL_SETTINGS:
include(EXTERNAL_SETTINGS, scope=locals())
3. URL Patterns and Routing
============================
3.1 Modular URL Architecture
-----------------------------
**REQUIRED**: Implement hierarchical URL organization with namespacing:
.. code-block:: python
# urls.py
def get_urlpatterns(prefix=None):
"""Dynamic URL pattern generation with prefix support"""
if not prefix:
prefix = '/'
else:
prefix = f'/{prefix}/'
return [
path(f'api{prefix}', include('awx.api.urls', namespace='api')),
path(f'ui{prefix}', include('awx.ui.urls', namespace='ui')),
]
urlpatterns = get_urlpatterns()
3.2 Environment-Specific URL Inclusion
--------------------------------------
**REQUIRED**: Conditional URL patterns based on environment:
This example allows the Django debug toolbar to work.
.. code-block:: python
# Development-only URLs
if settings.DEBUG:
try:
import debug_toolbar
urlpatterns += [path('__debug__/', include(debug_toolbar.urls))]
except ImportError:
pass
**OPTIONAL**: If you want to include your own debug logic and endpoints:
.. code-block:: python
if MODE == 'development':
# Only include these if we are in the development environment
from awx.api.swagger import schema_view
from awx.api.urls.debug import urls as debug_urls
urlpatterns += [re_path(r'^debug/', include(debug_urls))]
urlpatterns += [
re_path(r'^swagger(?P<format>\.json|\.yaml)/$', schema_view.without_ui(cache_timeout=0), name='schema-json'),
re_path(r'^swagger/$', schema_view.with_ui('swagger', cache_timeout=0), name='schema-swagger-ui'),
re_path(r'^redoc/$', schema_view.with_ui('redoc', cache_timeout=0), name='schema-redoc'),
]
**Requirements**:
- Use Django's ``include()`` for modular organization
- Implement URL namespacing for API versioning
- Support dynamic URL prefix configuration
- Separate URL patterns by functional area
----
4. Model Design
===============
4.1 Abstract Base Models
------------------------
**REQUIRED**: Use abstract base models for common functionality:
.. code-block:: python
# models/base.py
class BaseModel(models.Model):
"""Common fields and methods for all models"""
created = models.DateTimeField(auto_now_add=True)
modified = models.DateTimeField(auto_now=True)
class Meta:
abstract = True
class AuditableModel(BaseModel):
"""Models requiring audit trail"""
created_by = models.ForeignKey(User, on_delete=models.CASCADE)
class Meta:
abstract = True
4.2 Mixin-Based Architecture
----------------------------
**REQUIRED**: Implement reusable model behaviors through mixins:
.. code-block:: python
# models/mixins.py
class ResourceMixin(models.Model):
"""Common resource management functionality"""
class Meta:
abstract = True
class ExecutionEnvironmentMixin(models.Model):
"""Execution environment configuration"""
class Meta:
abstract = True
4.3 Model Organization
----------------------
**REQUIRED**: Organize models by domain functionality::
models/
├── __init__.py
├── base.py # Abstract base models
├── mixins.py # Reusable model behaviors
├── inventory.py # Inventory-related models
├── jobs.py # Job execution models
├── credential.py # Credential management
└── organization.py # Organization models
**Requirements**:
- One file per logical domain; when a domain grows too large, break it out into a package (folder) instead. In the past, credentials were split across logical domains until they were moved out of AWX, at which point they were collapsed back into a single file.
- Use consistent naming conventions
- Implement comprehensive model validation
- Custom managers for complex queries
----
5. REST API Development
=======================
5.1 Custom Authentication Classes
----------------------------------
The recommended best practice is to log all of the terminal (return) paths of authentication, not just the successful ones.
**REQUIRED**: Implement domain-specific authentication with logging:
.. code-block:: python
# api/authentication.py
class LoggedBasicAuthentication(authentication.BasicAuthentication):
"""Basic authentication with request logging"""
def authenticate(self, request):
if not settings.AUTH_BASIC_ENABLED:
return
ret = super().authenticate(request)
if ret:
username = ret[0].username if ret[0] else '<none>'
logger.info(
f"User {username} performed {request.method} "
f"to {request.path} through the API"
)
return ret
5.2 Custom Permission Classes
-----------------------------
**REQUIRED**: Implement comprehensive permission checking:
.. code-block:: python
# api/permissions.py
class ModelAccessPermission(permissions.BasePermission):
"""Model-based access control with hierarchy support"""
def has_permission(self, request, view):
if hasattr(view, 'parent_model'):
parent_obj = view.get_parent_object()
return check_user_access(
request.user,
view.parent_model,
'read',
parent_obj
)
return True
**Requirements**:
- Multiple authentication methods (JWT, Session, Basic)
- Custom pagination, renderers, and metadata classes
- Comprehensive API exception handling
- Resource-based URL organization
- Logging for authentication events
----
6. Security Requirements
========================
6.1 Production Security Settings
--------------------------------
**REQUIRED**: Enforce secure defaults for production:
.. code-block:: python
# settings/production.py
DEBUG = False
SECRET_KEY = None # Force explicit configuration
ALLOWED_HOSTS = [] # Must be explicitly set
# Session security
SESSION_COOKIE_SECURE = True
SESSION_COOKIE_HTTPONLY = True
SESSION_COOKIE_SAMESITE = 'Lax'
SESSION_COOKIE_AGE = 1800
# CSRF protection
CSRF_COOKIE_SECURE = True
CSRF_COOKIE_HTTPONLY = True
CSRF_TRUSTED_ORIGINS = []
6.2 Django SECRET_KEY loading
------------------------------
**REQUIRED**: Implement Django SECRET_KEY loading:
.. code-block:: python
# Secret key from external file
SECRET_KEY_FILE = os.environ.get('SECRET_KEY_FILE', '/etc/awx/SECRET_KEY')
if os.path.exists(SECRET_KEY_FILE):
with open(SECRET_KEY_FILE, 'rb') as f:
SECRET_KEY = f.read().strip().decode()
else:
if not DEBUG:
raise ImproperlyConfigured("SECRET_KEY must be configured in production")
For more detail, refer to the `Django documentation <https://docs.djangoproject.com/en/5.2/ref/settings/#secret-key>`_.
6.3 Proxy and Network Security
------------------------------
**REQUIRED**: Configure reverse proxy security:
.. code-block:: python
# Proxy configuration
REMOTE_HOST_HEADERS = ['REMOTE_ADDR', 'REMOTE_HOST']
PROXY_IP_ALLOWED_LIST = []
USE_X_FORWARDED_HOST = True
USE_X_FORWARDED_PORT = True
SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
**Requirements**:
- External secret file management
- Secure cookie configuration
- CSRF protection with trusted origins
- Proxy header validation
- Force HTTPS in production
----
7. Database Management
======================
7.1 Advanced Database Configuration
-----------------------------------
**REQUIRED**: Robust database connections for production:
.. code-block:: python
# Database configuration with connection tuning
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': os.environ.get('DATABASE_NAME', 'awx'),
'ATOMIC_REQUESTS': True,
'CONN_MAX_AGE': 0,
'OPTIONS': {
'keepalives': 1,
'keepalives_idle': 5,
'keepalives_interval': 5,
'keepalives_count': 5,
},
}
}
7.2 Database Version Validation
-------------------------------
**REQUIRED**: Implement database compatibility checking:
.. code-block:: python
# PostgreSQL version enforcement
def validate_database_version():
from django.db import connection
if (connection.pg_version // 10000) < 12:
raise ImproperlyConfigured(
"PostgreSQL version 12 or higher is required"
)
7.3 Migration Management
------------------------
**REQUIRED**: Structured migration organization
::
migrations/
├── 0001_initial.py
├── 0002_squashed_v300_release.py
├── 0003_squashed_v300_v303_updates.py
└── _migration_utils.py
**Requirements**:
It is best practice not to rewrite migrations. When possible, include a reverse migration, especially for data migrations, to make testing easier.
----
8. Testing Standards
====================
8.1 Pytest Configuration
-------------------------
**REQUIRED**: Comprehensive test setup with optimization:
.. code-block:: ini
# pytest.ini
[pytest]
DJANGO_SETTINGS_MODULE = awx.main.tests.settings_for_test
python_files = *.py
addopts = --reuse-db --nomigrations --tb=native
markers =
ac: access control test
survey: tests related to survey feature
inventory_import: tests of code used by inventory import command
integration: integration tests requiring external services
8.2 Test Settings Module
-------------------------
**REQUIRED**: Dedicated test configuration:
.. code-block:: python
# settings/testing.py
from .defaults import *
# Fast test database
DATABASES['default']['ENGINE'] = 'django.db.backends.sqlite3'
DATABASES['default']['NAME'] = ':memory:'
# Disable migrations for speed
class DisableMigrations:
def __contains__(self, item):
return True
def __getitem__(self, item):
return None
MIGRATION_MODULES = DisableMigrations()
8.3 Coverage Requirements
-------------------------
**REQUIRED**: Enforce comprehensive test coverage:
.. code-block:: python
# Coverage targets
COVERAGE_TARGETS = {
'project_overall': 75,
'library_code': 75,
'test_code': 95,
'new_patches': 100,
'type_checking': 100,
}
**Requirements**:
- Database reuse for faster execution
- Skip migrations in tests
- Custom test markers for categorization
- Dedicated test settings module
- Comprehensive warning filters
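Custom markers declared in ``pytest.ini`` can then be applied per test and selected with ``pytest -m``; a trivial sketch (the test body is illustrative):

```python
import pytest


@pytest.mark.inventory_import
def test_hostnames_are_parsed():
    hosts = 'web1,web2'.split(',')
    assert hosts == ['web1', 'web2']
```

Running ``pytest -m inventory_import`` selects only tests carrying that marker.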
----
9. Application Configuration
=============================
9.1 Advanced AppConfig Implementation
--------------------------------------
**REQUIRED**: Custom application configuration with initialization:
.. code-block:: python
# apps.py
class MainConfig(AppConfig):
name = 'awx.main'
verbose_name = _('Main')
default_auto_field = 'django.db.models.AutoField'
def ready(self):
super().ready()
# Feature loading with environment checks
if not os.environ.get('AWX_SKIP_FEATURES', None):
self.load_credential_types()
self.load_inventory_plugins()
self.load_named_urls()
# Signal registration
self.register_signals()
def load_credential_types(self):
"""Load credential type definitions"""
pass
def register_signals(self):
"""Register Django signals"""
pass
**Requirements**:
- Custom AppConfig for complex initialization
- Feature loading in ``ready()`` method
- Environment-based feature toggling
- Plugin system integration
- Signal registration
----
10. Middleware Implementation
=============================
10.1 Custom Middleware for Enterprise Features
----------------------------------------------
**REQUIRED**: Implement domain-specific middleware:
.. code-block:: python
# middleware.py
class SettingsCacheMiddleware(MiddlewareMixin):
"""Clear settings cache on each request"""
def process_request(self, request):
from django.conf import settings
if hasattr(settings, '_awx_conf_memoizedcache'):
settings._awx_conf_memoizedcache.clear()
class TimingMiddleware(threading.local, MiddlewareMixin):
"""Request timing and performance monitoring"""
def process_request(self, request):
self.start_time = time.time()
def process_response(self, request, response):
if hasattr(self, 'start_time'):
duration = time.time() - self.start_time
response['X-Response-Time'] = f"{duration:.3f}s"
return response
**Requirements**:
- Settings cache management middleware
- Performance monitoring middleware
- Thread-local storage for request data
- Conditional middleware activation
----
11. Deployment Patterns
========================
11.1 Production-Ready ASGI/WSGI Configuration
---------------------------------------------
**REQUIRED**: Proper application server setup:
.. code-block:: python
# asgi.py
import os
import django
from channels.routing import get_default_application
from awx import prepare_env
prepare_env()
django.setup()
application = get_default_application()
# wsgi.py
import os
from django.core.wsgi import get_wsgi_application
from awx import prepare_env
prepare_env()
application = get_wsgi_application()
----
Compliance Checklist
=====================
Development Standards
---------------------
.. list-table::
:header-rows: 1
:widths: 50 10
* - Requirement
- Status
* - Modular app architecture implemented
- ☐
* - Environment-based settings configured
- ☐
* - Custom authentication and permissions
- ☐
* - Comprehensive test coverage (>75%)
- ☐
* - Security settings enforced
- ☐
* - Database optimization configured
- ☐
* - Static files properly organized
- ☐
* - Custom middleware implemented
- ☐
Production Readiness
--------------------
.. list-table::
:header-rows: 1
:widths: 50 10
* - Requirement
- Status
* - External secret management
- ☐
* - Database version validation
- ☐
* - Version deployment verification
- ☐
* - Performance monitoring
- ☐
* - Security headers configured
- ☐
* - HTTPS enforcement
- ☐
* - Proper logging setup
- ☐
* - Error handling and monitoring
- ☐
Code Quality
------------
.. list-table::
:header-rows: 1
:widths: 50 10
* - Requirement
- Status
* - Abstract base models used
- ☐
* - Mixin-based architecture
- ☐
* - Custom management commands
- ☐
* - Plugin system support
- ☐
* - Signal registration
- ☐
* - Migration organization
- ☐
* - API documentation
- ☐
* - Type hints and validation
- ☐
----
References
==========
- **Django Documentation**: https://docs.djangoproject.com/
- **Django REST Framework**: https://www.django-rest-framework.org/
- **Django Split Settings**: https://github.com/sobolevn/django-split-settings
- **AWX Source Code**: https://github.com/ansible/awx
----
| **Document Maintainer**: Development Team
| **Last Updated**: September 2025
| **Review Schedule**: Quarterly


@@ -15,6 +15,8 @@ Ansible AWX helps teams manage complex multi-tier deployments by adding control,
:caption: Community
contributor/index
contributor/DJANGO_REQUIREMENTS
contributor/API_REQUIREMENTS
.. toctree::
:maxdepth: 2


@@ -1,21 +0,0 @@
The MIT License (MIT)
Copyright (c) 2017 Laurent LAPORTE
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

Binary file not shown.


@@ -1,165 +0,0 @@
GNU LESSER GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
This version of the GNU Lesser General Public License incorporates
the terms and conditions of version 3 of the GNU General Public
License, supplemented by the additional permissions listed below.
0. Additional Definitions.
As used herein, "this License" refers to version 3 of the GNU Lesser
General Public License, and the "GNU GPL" refers to version 3 of the GNU
General Public License.
"The Library" refers to a covered work governed by this License,
other than an Application or a Combined Work as defined below.
An "Application" is any work that makes use of an interface provided
by the Library, but which is not otherwise based on the Library.
Defining a subclass of a class defined by the Library is deemed a mode
of using an interface provided by the Library.
A "Combined Work" is a work produced by combining or linking an
Application with the Library. The particular version of the Library
with which the Combined Work was made is also called the "Linked
Version".
The "Minimal Corresponding Source" for a Combined Work means the
Corresponding Source for the Combined Work, excluding any source code
for portions of the Combined Work that, considered in isolation, are
based on the Application, and not on the Linked Version.
The "Corresponding Application Code" for a Combined Work means the
object code and/or source code for the Application, including any data
and utility programs needed for reproducing the Combined Work from the
Application, but excluding the System Libraries of the Combined Work.
1. Exception to Section 3 of the GNU GPL.
You may convey a covered work under sections 3 and 4 of this License
without being bound by section 3 of the GNU GPL.
2. Conveying Modified Versions.
If you modify a copy of the Library, and, in your modifications, a
facility refers to a function or data to be supplied by an Application
that uses the facility (other than as an argument passed when the
facility is invoked), then you may convey a copy of the modified
version:
a) under this License, provided that you make a good faith effort to
ensure that, in the event an Application does not supply the
function or data, the facility still operates, and performs
whatever part of its purpose remains meaningful, or
b) under the GNU GPL, with none of the additional permissions of
this License applicable to that copy.
3. Object Code Incorporating Material from Library Header Files.
The object code form of an Application may incorporate material from
a header file that is part of the Library. You may convey such object
code under terms of your choice, provided that, if the incorporated
material is not limited to numerical parameters, data structure
layouts and accessors, or small macros, inline functions and templates
(ten or fewer lines in length), you do both of the following:
a) Give prominent notice with each copy of the object code that the
Library is used in it and that the Library and its use are
covered by this License.
b) Accompany the object code with a copy of the GNU GPL and this license
document.
4. Combined Works.
You may convey a Combined Work under terms of your choice that,
taken together, effectively do not restrict modification of the
portions of the Library contained in the Combined Work and reverse
engineering for debugging such modifications, if you also do each of
the following:
a) Give prominent notice with each copy of the Combined Work that
the Library is used in it and that the Library and its use are
covered by this License.
b) Accompany the Combined Work with a copy of the GNU GPL and this license
document.
c) For a Combined Work that displays copyright notices during
execution, include the copyright notice for the Library among
these notices, as well as a reference directing the user to the
copies of the GNU GPL and this license document.
d) Do one of the following:
0) Convey the Minimal Corresponding Source under the terms of this
License, and the Corresponding Application Code in a form
suitable for, and under terms that permit, the user to
recombine or relink the Application with a modified version of
the Linked Version to produce a modified Combined Work, in the
manner specified by section 6 of the GNU GPL for conveying
Corresponding Source.
1) Use a suitable shared library mechanism for linking with the
Library. A suitable mechanism is one that (a) uses at run time
a copy of the Library already present on the user's computer
system, and (b) will operate properly with a modified version
of the Library that is interface-compatible with the Linked
Version.
e) Provide Installation Information, but only if you would otherwise
be required to provide such information under section 6 of the
GNU GPL, and only to the extent that such information is
necessary to install and execute a modified version of the
Combined Work produced by recombining or relinking the
Application with a modified version of the Linked Version. (If
you use option 4d0, the Installation Information must accompany
the Minimal Corresponding Source and Corresponding Application
Code. If you use option 4d1, you must provide the Installation
Information in the manner specified by section 6 of the GNU GPL
for conveying Corresponding Source.)
5. Combined Libraries.
You may place library facilities that are a work based on the
Library side by side in a single library together with other library
facilities that are not Applications and are not covered by this
License, and convey such a combined library under terms of your
choice, if you do both of the following:
a) Accompany the combined library with a copy of the same work based
on the Library, uncombined with any other library facilities,
conveyed under the terms of this License.
b) Give prominent notice with the combined library that part of it
is a work based on the Library, and explaining where to find the
accompanying uncombined form of the same work.
6. Revised Versions of the GNU Lesser General Public License.
The Free Software Foundation may publish revised and/or new versions
of the GNU Lesser General Public License from time to time. Such new
versions will be similar in spirit to the present version, but may
differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the
Library as you received it specifies that a certain numbered version
of the GNU Lesser General Public License "or any later version"
applies to it, you have the option of following the terms and
conditions either of that published version or of any later version
published by the Free Software Foundation. If the Library as you
received it does not specify a version number of the GNU Lesser
General Public License, you may choose any version of the GNU Lesser
General Public License ever published by the Free Software Foundation.
If the Library as you received it specifies that a proxy can decide
whether future versions of the GNU Lesser General Public License shall
apply, that proxy's public statement of acceptance of any version is
permanent authorization for you to choose that version for the
Library.

View File

@@ -1,11 +0,0 @@
Copyright 2022 Rick van Hattem
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

Binary file not shown.

View File

@@ -1,26 +1,27 @@
Copyright (c) 2013, Massimiliano Pippi, Federico Frenguelli and contributors
Copyright (c) 2016, Gregory Szorc
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
The views and conclusions contained in the software and documentation are those
of the authors and should not be interpreted as representing official policies,
either expressed or implied, of the FreeBSD Project.

View File

@@ -49,29 +49,19 @@ Make sure to delete the old tarball if it is an upgrade.
Anything pinned in `*.in` files involves additional manual work in
order to upgrade. Some information related to that work is outlined here.
### django-oauth-toolkit
### pip, setuptools and setuptools_scm, wheel, cython
Versions later than 1.4.1 throw an error about id_token_id, due to the
OpenID Connect work that was done in
https://github.com/jazzband/django-oauth-toolkit/pull/915. This may
be fixable by creating a migration on our end?
### pip, setuptools and setuptools_scm
If modifying these libraries make sure testing with the offline build is performed to confirm they are functionally working.
Versions need to match the versions used in the pip bootstrapping step
in the top-level Makefile.
If modifying these libraries make sure testing with the offline build is performed to confirm
they are functionally working. Versions need to match the versions used in the pip bootstrapping
step in the top-level Makefile.
Verify ansible-runner's build dependency doesn't conflict with the changes made.
### cryptography
If modifying this library make sure testing with the offline build is performed to confirm it is functionally working.
## Library Notes
### pexpect
Version 4.8 makes us a little bit nervous with changes to `searchwindowsize` https://github.com/pexpect/pexpect/pull/579/files
Pin to `pexpect==4.7.x` until we have more time to move to `4.8` and test.
### urllib3 and OPA-python-client
There are incompatible version dependencies for urllib3 between OPA-python-client and kubernetes.
OPA-python-client v2.0.3+ requires urllib3 v2.5.0+, while kubernetes v34.1.0 caps it below v2.4.0.
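The conflict above can be sketched in plain Python — a minimal illustration (not a real resolver) showing that no urllib3 version can satisfy both stated constraints; the version tuples are assumptions taken from the note:

```python
# Hedged illustration of the urllib3 pin conflict described above:
# OPA-python-client 2.0.3+ needs urllib3 >= 2.5.0, while kubernetes
# 34.1.0 requires urllib3 < 2.4.0 -- the ranges are disjoint, so
# dependency resolution fails.

def satisfies(version, lower=None, upper=None):
    """True if lower <= version < upper (None means unbounded)."""
    if lower is not None and version < lower:
        return False
    if upper is not None and version >= upper:
        return False
    return True

opa_lower = (2, 5, 0)   # urllib3>=2.5.0 (OPA-python-client 2.0.3+)
k8s_upper = (2, 4, 0)   # urllib3<2.4.0  (kubernetes 34.1.0)

candidates = [(2, 3, 0), (2, 4, 0), (2, 5, 0), (2, 6, 0)]
compatible = [v for v in candidates
              if satisfies(v, lower=opa_lower) and satisfies(v, upper=k8s_upper)]
print(compatible)  # -> [] : no version satisfies both pins
```

This is why the `.in` file pins OPA-python-client at 2.0.2 until kubernetes lifts its cap.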
### djangorestframework
Upgrading to 3.16.1 introduced errors in the tests around CredentialInputSource. We have several
fields on that model set to default=null, but in the serializer they are marked required: true,
which causes a conflict.

View File

@@ -9,7 +9,7 @@ boto3
botocore
channels
channels-redis
cryptography<42.0.0 # investigation is needed for 42+ to work with OpenSSL v3.0.x (RHEL 9.4) and v3.2.x (RHEL 9.5)
cryptography
Cython
daphne
distro
@@ -18,12 +18,11 @@ django-cors-headers
django-crum
django-extensions
django-guid
django-oauth-toolkit<2.0.0 # Version 2.0.0 has breaking changes that will need to be worked out before upgrading
django-polymorphic
django-solo
djangorestframework>=3.15.0
djangorestframework==3.15.2 # upgrading to 3.16+ throws NOT_REQUIRED_DEFAULT error on serializer fields marked required that also have a default
djangorestframework-yaml
dynaconf<4
dynaconf
filelock
GitPython>=3.1.37 # CVE-2023-41040
grpcio
@@ -35,20 +34,20 @@ Markdown # used for formatting API help
maturin # pydantic-core build dep
msgpack
msrestazure
OPA-python-client==2.0.2 # Code contain monkey patch targeted to 2.0.2 to fix https://github.com/Turall/OPA-python-client/issues/29
OPA-python-client==2.0.2 # upgrading requires urllib3 2.5.0+ which is blocked by other deps
openshift
opentelemetry-api~=1.24 # new y streams can be drastically different, in a good way
opentelemetry-sdk~=1.24
opentelemetry-api~=1.37 # new y streams can be drastically different, in a good way
opentelemetry-sdk~=1.37
opentelemetry-instrumentation-logging
opentelemetry-exporter-otlp
pexpect==4.7.0 # see library notes
pexpect
prometheus_client
psycopg
psutil
pygerduty
PyGithub <= 2.6.0
pyopenssl>=23.2.0 # resolve dep conflict from cryptography pin above
pyparsing==2.4.6 # Upgrading to v3 of pyparsing introduces errors on smart host filtering: Expected 'or' term, found 'or' (at char 15), (line:1, col:16)
PyGithub
pyopenssl
pyparsing==2.4.7 # Upgrading to v3 of pyparsing introduces errors on smart host filtering: Expected 'or' term, found 'or' (at char 15), (line:1, col:16)
python-daemon
python-dsv-sdk>=1.0.4
python-tss-sdk>=1.2.1
@@ -61,13 +60,13 @@ requests
slack-sdk
twilio
twisted[tls]>=24.7.0 # CVE-2024-41810
urllib3>=1.26.19 # CVE-2024-37891
urllib3<2.4.0, >=1.26.19 # CVE-2024-37891. capped by kubernetes 34.1.0 reqs
uWSGI>=2.0.28
uwsgitop
wheel>=0.38.1 # CVE-2022-40898
pip==21.2.4 # see UPGRADE BLOCKERs
setuptools==80.9.0 # see UPGRADE BLOCKERs
setuptools_scm[toml] # see UPGRADE BLOCKERs, xmlsec build dep
setuptools_scm[toml]
setuptools-rust>=0.11.4 # cryptography build dep
pkgconfig>=1.5.1 # xmlsec build dep - needed for offline build
django-flags>=5.0.13

View File

@@ -1,20 +1,20 @@
adal==1.2.7
# via msrestazure
aiodns==3.2.0
aiodns==3.5.0
# via aiohttp
aiofiles==24.1.0
# via opa-python-client
aiohappyeyeballs==2.4.4
aiohappyeyeballs==2.6.1
# via aiohttp
aiohttp[speedups]==3.11.11
aiohttp[speedups]==3.13.0
# via
# -r /awx_devel/requirements/requirements.in
# aiohttp-retry
# opa-python-client
# twilio
aiohttp-retry==2.8.3
aiohttp-retry==2.9.1
# via twilio
aiosignal==1.3.2
aiosignal==1.4.0
# via aiohttp
ansi2html==1.9.2
# via -r /awx_devel/requirements/requirements.in
@@ -22,7 +22,7 @@ ansi2html==1.9.2
# via -r /awx_devel/requirements/requirements_git.txt
asciichartpy==1.5.25
# via -r /awx_devel/requirements/requirements.in
asgiref==3.8.1
asgiref==3.10.0
# via
# channels
# channels-redis
@@ -30,9 +30,9 @@ asgiref==3.8.1
# django
# django-ansible-base
# django-cors-headers
asn1==2.7.1
asn1==3.1.0
# via -r /awx_devel/requirements/requirements.in
attrs==24.3.0
attrs==25.4.0
# via
# aiohttp
# jsonschema
@@ -43,7 +43,7 @@ autobahn==24.4.2
# via daphne
autocommand==2.2.2
# via jaraco-text
automat==24.8.1
automat==25.4.16
# via twisted
# awx-plugins-core @ git+https://github.com/ansible/awx-plugins.git@devel # git requirements installed separately
# via -r /awx_devel/requirements/requirements_git.txt
@@ -51,35 +51,35 @@ awx-plugins.interfaces @ git+https://github.com/ansible/awx_plugins.interfaces.g
# via
# -r /awx_devel/requirements/requirements_git.txt
# awx-plugins-core
azure-core==1.32.0
azure-core==1.35.1
# via
# azure-identity
# azure-keyvault-certificates
# azure-keyvault-keys
# azure-keyvault-secrets
# msrest
azure-identity==1.19.0
azure-identity==1.25.1
# via -r /awx_devel/requirements/requirements.in
azure-keyvault==4.2.0
# via -r /awx_devel/requirements/requirements.in
azure-keyvault-certificates==4.9.0
azure-keyvault-certificates==4.10.0
# via azure-keyvault
azure-keyvault-keys==4.10.0
azure-keyvault-keys==4.11.0
# via azure-keyvault
azure-keyvault-secrets==4.9.0
azure-keyvault-secrets==4.10.0
# via azure-keyvault
backports-tarfile==1.2.0
# via jaraco-context
boto3==1.35.96
boto3==1.40.46
# via -r /awx_devel/requirements/requirements.in
botocore==1.35.96
botocore==1.40.46
# via
# -r /awx_devel/requirements/requirements.in
# boto3
# s3transfer
brotli==1.1.0
# via aiohttp
cachetools==5.5.0
cachetools==6.2.0
# via google-auth
# git+https://github.com/ansible/system-certifi.git@devel # git requirements installed separately
# via
@@ -87,24 +87,24 @@ cachetools==5.5.0
# kubernetes
# msrest
# requests
cffi==1.17.1
cffi==2.0.0
# via
# cryptography
# pycares
# pynacl
channels==4.2.0
channels==4.3.1
# via
# -r /awx_devel/requirements/requirements.in
# channels-redis
channels-redis==4.2.1
channels-redis==4.3.0
# via -r /awx_devel/requirements/requirements.in
charset-normalizer==3.4.1
charset-normalizer==3.4.3
# via requests
click==8.1.8
# via receptorctl
constantly==23.10.4
# via twisted
cryptography==41.0.7
cryptography==46.0.2
# via
# -r /awx_devel/requirements/requirements.in
# adal
@@ -112,22 +112,14 @@ cryptography==41.0.7
# azure-identity
# azure-keyvault-keys
# django-ansible-base
# jwcrypto
# msal
# pyjwt
# pyopenssl
# service-identity
cython==3.1.3
# via -r /awx_devel/requirements/requirements.in
daphne==4.1.2
daphne==4.2.1
# via -r /awx_devel/requirements/requirements.in
deprecated==1.2.15
# via
# opentelemetry-api
# opentelemetry-exporter-otlp-proto-grpc
# opentelemetry-exporter-otlp-proto-http
# opentelemetry-semantic-conventions
# pygithub
dispatcherd==2025.5.21
# via -r /awx_devel/requirements/requirements.in
distro==1.9.0
@@ -142,29 +134,26 @@ django==4.2.21
# django-extensions
# django-flags
# django-guid
# django-oauth-toolkit
# django-polymorphic
# django-solo
# djangorestframework
# django-ansible-base @ git+https://github.com/ansible/django-ansible-base@devel # git requirements installed separately
# via -r /awx_devel/requirements/requirements_git.txt
django-cors-headers==4.6.0
django-cors-headers==4.9.0
# via -r /awx_devel/requirements/requirements.in
django-crum==0.7.9
# via
# -r /awx_devel/requirements/requirements.in
# django-ansible-base
django-extensions==3.2.3
django-extensions==4.1
# via -r /awx_devel/requirements/requirements.in
django-flags==5.0.13
django-flags==5.0.14
# via
# -r /awx_devel/requirements/requirements.in
# django-ansible-base
django-guid==3.5.0
django-guid==3.5.2
# via -r /awx_devel/requirements/requirements.in
django-oauth-toolkit==1.7.1
# via -r /awx_devel/requirements/requirements.in
django-polymorphic==3.1.0
django-polymorphic==4.1.0
# via -r /awx_devel/requirements/requirements.in
django-solo==2.4.0
# via -r /awx_devel/requirements/requirements.in
@@ -174,35 +163,35 @@ djangorestframework==3.15.2
# django-ansible-base
djangorestframework-yaml==2.0.0
# via -r /awx_devel/requirements/requirements.in
durationpy==0.9
durationpy==0.10
# via kubernetes
dynaconf==3.2.10
dynaconf==3.2.11
# via
# -r /awx_devel/requirements/requirements.in
# django-ansible-base
enum-compat==0.0.3
# via asn1
filelock==3.16.1
filelock==3.19.1
# via -r /awx_devel/requirements/requirements.in
frozenlist==1.5.0
frozenlist==1.8.0
# via
# aiohttp
# aiosignal
gitdb==4.0.12
# via gitpython
gitpython==3.1.44
gitpython==3.1.45
# via -r /awx_devel/requirements/requirements.in
google-auth==2.37.0
google-auth==2.41.1
# via kubernetes
googleapis-common-protos==1.66.0
googleapis-common-protos==1.70.0
# via
# opentelemetry-exporter-otlp-proto-grpc
# opentelemetry-exporter-otlp-proto-http
grpcio==1.69.0
grpcio==1.75.1
# via
# -r /awx_devel/requirements/requirements.in
# opentelemetry-exporter-otlp-proto-grpc
hiredis==3.1.0
hiredis==3.2.1
# via redis
hyperlink==21.0.0
# via
@@ -215,7 +204,7 @@ idna==3.10
# requests
# twisted
# yarl
importlib-metadata==8.5.0
importlib-metadata==8.7.0
# via opentelemetry-api
importlib-resources==6.5.2
# via irc
@@ -231,16 +220,16 @@ isodate==0.7.2
# azure-keyvault-keys
# azure-keyvault-secrets
# msrest
jaraco-collections==5.1.0
jaraco-collections==5.2.1
# via irc
jaraco-context==6.0.1
# via jaraco-text
jaraco-functools==4.1.0
jaraco-functools==4.3.0
# via
# irc
# jaraco-text
# tempora
jaraco-logging==3.3.0
jaraco-logging==3.4.0
# via irc
jaraco-stream==3.0.4
# via irc
@@ -248,45 +237,43 @@ jaraco-text==4.0.0
# via
# irc
# jaraco-collections
jinja2==3.1.5
jinja2==3.1.6
# via -r /awx_devel/requirements/requirements.in
jmespath==1.0.1
# via
# boto3
# botocore
jq==1.8.0
jq==1.10.0
# via -r /awx_devel/requirements/requirements.in
json-log-formatter==1.1
json-log-formatter==1.1.1
# via -r /awx_devel/requirements/requirements.in
jsonschema==4.23.0
jsonschema==4.25.1
# via -r /awx_devel/requirements/requirements.in
jsonschema-specifications==2024.10.1
jsonschema-specifications==2025.9.1
# via jsonschema
jwcrypto==1.5.6
# via django-oauth-toolkit
kubernetes==31.0.0
kubernetes==34.1.0
# via openshift
lockfile==0.12.2
# via python-daemon
markdown==3.7
markdown==3.9
# via -r /awx_devel/requirements/requirements.in
markupsafe==3.0.2
markupsafe==3.0.3
# via jinja2
maturin==1.8.1
maturin==1.9.6
# via -r /awx_devel/requirements/requirements.in
more-itertools==10.5.0
more-itertools==10.8.0
# via
# irc
# jaraco-functools
# jaraco-stream
# jaraco-text
msal==1.31.1
msal==1.34.0
# via
# azure-identity
# msal-extensions
msal-extensions==1.2.0
msal-extensions==1.3.1
# via azure-identity
msgpack==1.1.0
msgpack==1.1.1
# via
# -r /awx_devel/requirements/requirements.in
# channels-redis
@@ -294,20 +281,17 @@ msrest==0.7.1
# via msrestazure
msrestazure==0.6.4.post1
# via -r /awx_devel/requirements/requirements.in
multidict==6.1.0
multidict==6.7.0
# via
# aiohttp
# yarl
oauthlib==3.2.2
# via
# django-oauth-toolkit
# kubernetes
# requests-oauthlib
oauthlib==3.3.1
# via requests-oauthlib
opa-python-client==2.0.2
# via -r /awx_devel/requirements/requirements.in
openshift==0.13.2
# via -r /awx_devel/requirements/requirements.in
opentelemetry-api==1.29.0
opentelemetry-api==1.37.0
# via
# -r /awx_devel/requirements/requirements.in
# opentelemetry-exporter-otlp-proto-grpc
@@ -316,60 +300,62 @@ opentelemetry-api==1.29.0
# opentelemetry-instrumentation-logging
# opentelemetry-sdk
# opentelemetry-semantic-conventions
opentelemetry-exporter-otlp==1.29.0
opentelemetry-exporter-otlp==1.37.0
# via -r /awx_devel/requirements/requirements.in
opentelemetry-exporter-otlp-proto-common==1.29.0
opentelemetry-exporter-otlp-proto-common==1.37.0
# via
# opentelemetry-exporter-otlp-proto-grpc
# opentelemetry-exporter-otlp-proto-http
opentelemetry-exporter-otlp-proto-grpc==1.29.0
opentelemetry-exporter-otlp-proto-grpc==1.37.0
# via opentelemetry-exporter-otlp
opentelemetry-exporter-otlp-proto-http==1.29.0
opentelemetry-exporter-otlp-proto-http==1.37.0
# via opentelemetry-exporter-otlp
opentelemetry-instrumentation==0.50b0
opentelemetry-instrumentation==0.58b0
# via opentelemetry-instrumentation-logging
opentelemetry-instrumentation-logging==0.50b0
opentelemetry-instrumentation-logging==0.58b0
# via -r /awx_devel/requirements/requirements.in
opentelemetry-proto==1.29.0
opentelemetry-proto==1.37.0
# via
# opentelemetry-exporter-otlp-proto-common
# opentelemetry-exporter-otlp-proto-grpc
# opentelemetry-exporter-otlp-proto-http
opentelemetry-sdk==1.29.0
opentelemetry-sdk==1.37.0
# via
# -r /awx_devel/requirements/requirements.in
# opentelemetry-exporter-otlp-proto-grpc
# opentelemetry-exporter-otlp-proto-http
opentelemetry-semantic-conventions==0.50b0
opentelemetry-semantic-conventions==0.58b0
# via
# opentelemetry-instrumentation
# opentelemetry-sdk
packaging==24.2
packaging==25.0
# via
# ansible-runner
# django-guid
# opentelemetry-instrumentation
# setuptools-scm
pexpect==4.7.0
pbr==7.0.1
# via -r /awx_devel/requirements/requirements.in
pexpect==4.9.0
# via
# -r /awx_devel/requirements/requirements.in
# ansible-runner
pkgconfig==1.5.5
# via -r /awx_devel/requirements/requirements.in
portalocker==2.10.1
# via msal-extensions
prometheus-client==0.21.1
prometheus-client==0.23.1
# via -r /awx_devel/requirements/requirements.in
propcache==0.2.1
propcache==0.4.0
# via
# aiohttp
# yarl
protobuf==5.29.3
protobuf==6.32.1
# via
# -r /awx_devel/requirements/requirements.in
# googleapis-common-protos
# opentelemetry-proto
psutil==6.1.1
psutil==7.1.0
# via -r /awx_devel/requirements/requirements.in
psycopg==3.2.6
psycopg==3.2.10
# via -r /awx_devel/requirements/requirements.in
ptyprocess==0.7.0
# via pexpect
@@ -378,18 +364,20 @@ pyasn1==0.6.1
# pyasn1-modules
# rsa
# service-identity
pyasn1-modules==0.4.1
pyasn1-modules==0.4.2
# via
# google-auth
# service-identity
pycares==4.5.0
pycares==4.11.0
# via aiodns
pycparser==2.22
pycparser==2.23
# via cffi
pygerduty==0.38.3
# via -r /awx_devel/requirements/requirements.in
pygithub==2.6.1
# via awx-plugins-core
pygithub==2.8.1
# via
# -r /awx_devel/requirements/requirements.in
# awx-plugins-core
pyjwt[crypto]==2.10.1
# via
# adal
@@ -397,13 +385,13 @@ pyjwt[crypto]==2.10.1
# msal
# pygithub
# twilio
pynacl==1.5.0
pynacl==1.6.0
# via pygithub
pyopenssl==24.3.0
pyopenssl==25.3.0
# via
# -r /awx_devel/requirements/requirements.in
# twisted
pyparsing==2.4.6
pyparsing==2.4.7
# via -r /awx_devel/requirements/requirements.in
python-daemon==3.1.2
# via
@@ -420,11 +408,11 @@ python-dsv-sdk==1.0.4
# via -r /awx_devel/requirements/requirements.in
python-string-utils==1.0.0
# via openshift
python-tss-sdk==1.2.3
python-tss-sdk==2.0.0
# via -r /awx_devel/requirements/requirements.in
pytz==2024.2
pytz==2025.2
# via irc
pyyaml==6.0.2
pyyaml==6.0.3
# via
# -r /awx_devel/requirements/requirements.in
# ansible-runner
@@ -432,25 +420,24 @@ pyyaml==6.0.2
# djangorestframework-yaml
# kubernetes
# receptorctl
pyzstd==0.16.2
pyzstd==0.18.0
# via -r /awx_devel/requirements/requirements.in
receptorctl==1.5.2
receptorctl==1.6.0
# via -r /awx_devel/requirements/requirements.in
redis[hiredis]==5.2.1
redis[hiredis]==6.4.0
# via
# -r /awx_devel/requirements/requirements.in
# channels-redis
referencing==0.35.1
referencing==0.36.2
# via
# jsonschema
# jsonschema-specifications
requests==2.32.3
requests==2.32.5
# via
# -r /awx_devel/requirements/requirements.in
# adal
# azure-core
# django-ansible-base
# django-oauth-toolkit
# kubernetes
# msal
# msrest
@@ -465,13 +452,13 @@ requests-oauthlib==2.0.0
# via
# kubernetes
# msrest
rpds-py==0.22.3
rpds-py==0.27.1
# via
# jsonschema
# referencing
rsa==4.9
rsa==4.9.1
# via google-auth
s3transfer==0.10.4
s3transfer==0.14.0
# via boto3
semantic-version==2.10.0
# via setuptools-rust
@@ -489,37 +476,46 @@ six==1.17.0
# openshift
# pygerduty
# python-dateutil
slack-sdk==3.34.0
slack-sdk==3.37.0
# via -r /awx_devel/requirements/requirements.in
smmap==5.0.2
# via gitdb
sqlparse==0.5.3
# via
# -r /awx_devel/requirements/requirements.in
# django
# django-ansible-base
tempora==5.8.0
tempora==5.8.1
# via
# irc
# jaraco-logging
twilio==9.4.2
twilio==9.8.3
# via -r /awx_devel/requirements/requirements.in
twisted[tls]==24.11.0
twisted[tls]==25.5.0
# via
# -r /awx_devel/requirements/requirements.in
# daphne
txaio==23.1.1
txaio==25.9.2
# via autobahn
typing-extensions==4.12.2
typing-extensions==4.15.0
# via
# aiosignal
# azure-core
# azure-identity
# azure-keyvault-certificates
# azure-keyvault-keys
# azure-keyvault-secrets
# jwcrypto
# grpcio
# opentelemetry-api
# opentelemetry-exporter-otlp-proto-grpc
# opentelemetry-exporter-otlp-proto-http
# opentelemetry-sdk
# opentelemetry-semantic-conventions
# psycopg
# pygithub
# pyopenssl
# pyzstd
# referencing
# twisted
urllib3==2.3.0
# via
@@ -529,7 +525,7 @@ urllib3==2.3.0
# kubernetes
# pygithub
# requests
uwsgi==2.0.28
uwsgi==2.0.30
# via -r /awx_devel/requirements/requirements.in
uwsgitop==0.12
# via -r /awx_devel/requirements/requirements.in
@@ -537,16 +533,16 @@ websocket-client==1.8.0
# via kubernetes
wheel==0.42.0
# via -r /awx_devel/requirements/requirements.in
wrapt==1.17.0
# via
# deprecated
# opentelemetry-instrumentation
yarl==1.18.3
wrapt==1.17.3
# via opentelemetry-instrumentation
yarl==1.22.0
# via aiohttp
zipp==3.21.0
zipp==3.23.0
# via importlib-metadata
zope-interface==7.2
zope-interface==8.0.1
# via twisted
zstandard==0.25.0
# via aiohttp
# The following packages are considered to be unsafe in a requirements file:
pip==21.2.4
@@ -557,6 +553,6 @@ setuptools==80.9.0
# asciichartpy
# autobahn
# incremental
# pbr
# setuptools-rust
# setuptools-scm
# zope-interface

sonar-project.properties Normal file
View File

@@ -0,0 +1,141 @@
# SonarCloud project configuration for AWX
# Complete documentation: https://docs.sonarqube.org/latest/analysis/analysis-parameters/
# =============================================================================
# PROJECT IDENTIFICATION (REQUIRED)
# =============================================================================
# The unique project identifier. This is mandatory.
# Do not duplicate or reuse!
# Available characters: [a-zA-Z0-9_:\.\-]
# Must have at least one non-digit.
sonar.projectKey=ansible_awx
sonar.organization=ansible
# Project metadata
sonar.projectName=awx
# =============================================================================
# SOURCE AND TEST CONFIGURATION
# =============================================================================
# Source directories to analyze
sonar.sources=.
sonar.inclusions=awx/**
# Test directories
sonar.tests=awx/main/tests
# Test file patterns
sonar.test.inclusions=\
**/test_*.py,\
**/*_test.py,\
**/tests/**/*.py
# Set branch-specific new code definition
#
# This is important to always check against the main branch for new PRs,
# otherwise the PR may fail during backporting, since the old version of the code
# may not respect the minimum requirements for the existing Quality Gate.
sonar.newCode.referenceBranch=devel
# =============================================================================
# LANGUAGE CONFIGURATION
# =============================================================================
# Python versions supported by the project
#sonar.python.version=3.9,3.10,3.11
# File encoding
sonar.sourceEncoding=UTF-8
# =============================================================================
# REPORTS AND COVERAGE
# =============================================================================
# Test and coverage reports (paths relative to project root)
sonar.python.coverage.reportPaths=reports/coverage.xml
sonar.python.xunit.reportPath=reports/junit.xml
# External tool reports (add these paths when tools are configured)
# sonar.python.pylint.reportPaths=reports/pylint-report.txt
# sonar.python.bandit.reportPaths=reports/bandit-report.json
# sonar.python.mypy.reportPath=reports/mypy-report.txt
# sonar.python.flake8.reportPaths=reports/flake8-report.txt
# sonar.python.xunit.reportPath=reports/junit.xml
# =============================================================================
# EXCLUSIONS - FILES AND DIRECTORIES TO IGNORE
# =============================================================================
# General exclusions - files and directories to ignore from analysis
sonar.exclusions=\
**/tests/**,\
**/__pycache__/**,\
**/*.pyc,\
**/*.pyo,\
**/*.pyd,\
**/build/**,\
**/dist/**,\
**/*.egg-info/**
# =============================================================================
# COVERAGE EXCLUSIONS
# =============================================================================
# Files to exclude from coverage calculations
sonar.coverage.exclusions=\
**/tests/**,\
**/.tox/**,\
**/test_*.py,\
**/*_test.py,\
**/conftest.py,\
**/migrations/**,\
**/settings*.py,\
**/defaults.py,\
**/manage.py,\
**/__main__.py,\
tools/scripts/**
# =============================================================================
# DUPLICATION EXCLUSIONS
# =============================================================================
# Ignore code duplication in migrations and tests
sonar.cpd.exclusions=\
**/migrations/**,\
**/tests/**
# =============================================================================
# ISSUE IGNORE RULES
# =============================================================================
# Ignore specific rules for certain file patterns
sonar.issue.ignore.multicriteria=e1
# Ignore "should be a variable" in migrations
sonar.issue.ignore.multicriteria.e1.ruleKey=python:S1192
sonar.issue.ignore.multicriteria.e1.resourceKey=**/migrations/**/*
# =============================================================================
# GITHUB INTEGRATION
# =============================================================================
# The following properties are automatically handled by GitHub Actions:
# sonar.pullrequest.key - handled automatically
# sonar.pullrequest.branch - handled automatically
# sonar.pullrequest.base - handled automatically
# =============================================================================
# DEBUGGING
# =============================================================================
# These are aggressive settings to ensure maximum detection
# do not use in production
# sonar.verbose=true
# sonar.log.level=DEBUG
# sonar.scm.exclusions.disabled=true
# sonar.java.skipUnchanged=false
# sonar.scm.forceReloadAll=true
# sonar.filesize.limit=100
# sonar.qualitygate.wait=true