Compare commits


14 Commits

Author SHA1 Message Date
Chris Meyers
2ac304d289 allow pytest --migrations to succeed (#14663)
* allow pytest --migrations to succeed

* We normally prevent migrations from running in test via the pytest.ini
  --no-migrations option. This has led to bit rot for the sqlite
  migrations happy path. This changeset pays off that tech debt and
  restores a sqlite migration happy path.
* This paves the way for programmatic invocation of individual migrations
  and weaving in the creation of resources (e.g. Instance, Job Template,
  etc.). With this, a developer can instantiate various database states,
  trigger a migration, assert the state of the db, and then have pytest
  roll all of that back.
* I will note that in practice, running these migrations is painfully
  slow, BUT this work also opens up the possibility of saving and
  re-using sqlite3 database files. Normally, caching is not THE answer
  and causes more harm than good. But in this case, our migrations are
  mostly write-once (I say mostly because this changeset violates
  that), so cache invalidation isn't a major issue.

* functional test for migrations on sqlite

* We commonly skip running migrations in test land, and test land uses
  sqlite. Because this code path is not exercised regularly, it atrophies.
  The smoke test here exists to continuously exercise that code path.
* Add a CI job to run the migration tests separately; they take roughly
  2-3 minutes each on my laptop.
* The smoke test also serves as an example of how to write migration
  tests.

* run migration tests in ci
2023-11-17 13:33:08 -05:00
Don Naro
3e5851f3af Upgrade doc requirements (#14669)
* pass --upgrade when pip-compiling doc reqs

* upgrade doc requirements
2023-11-16 13:04:21 -07:00
Alan Rominger
adb1b12074 Update RBAC docs, remove unused get_permissions (#14492)
* Update RBAC docs, remove unused get_permissions

* Add back in section for get_roles_on_resource
2023-11-16 11:29:33 -05:00
Alan Rominger
8fae20c48a Remove unused methods we attach to user model (#14668) 2023-11-16 11:21:21 -05:00
Hao Liu
ec364cc60e Make vault init more idempotent (#14664)
Currently, if you clean up the docker volume for vault and bring the docker-compose development environment back up with vault enabled, we will not initialize vault because the secret files still exist.

This change will attempt to initialize vault regardless and update the secret file if vault is initialized.
2023-11-16 09:43:45 -06:00
TVo
1cfd51764e Added missing pointers to release notes (#14659)
* Replaced with larger graphic.

* Revert "Replaced with larger graphic."

This reverts commit 1214b00052.

* Added missing pointers to release notes.
2023-11-15 14:24:11 -07:00
Steffen Scheib
0b8fedfd04 Adding the possibility to decode base64-encoded strings from Delinea's DevOps Secrets Vault (DSV) (#14646)
Adding the possibility to decode base64-encoded strings retrieved from Delinea's DevOps Secrets Vault (DSV).
This is necessary because uploading files to DSV is not possible (and not meant to be), so files should be added base64-encoded.
The commit makes sure to remain backward compatible (no secret decoding by default), as a default is supplied.

This has been tested with DSV and works both for secrets that are base64-encoded and for secrets that are not (the default).

Signed-off-by: Steffen Scheib <sscheib@redhat.com>
2023-11-15 15:28:34 -05:00
Don Naro
72a8173462 issue #14653 heading does not render correctly (#14665) 2023-11-15 15:05:52 -05:00
Tong He
873b1fbe07 Set subscription type as developer for developer subscriptions. (#14584)
* Set subscription type as developer for developer subscriptions.

Signed-off-by: Tong He <the@redhat.com>

* Set subscription type as developer for developer subscription manifests.

Signed-off-by: Tong He <the@redhat.com>

* Fix the wrong character used to assign the value.

Signed-off-by: Tong He <the@redhat.com>

* Reformat licensing.py with black.

Signed-off-by: Tong He <the@redhat.com>

---------

Signed-off-by: Tong He <the@redhat.com>
2023-11-15 10:33:57 +00:00
Alan Rominger
1f36e84b45 Correctly handle case where unpartitioned table does not exist (#14648) 2023-11-14 08:38:48 -05:00
TVo
8c4bff2b86 Replaced with larger graphic. (#14647) 2023-11-13 09:55:04 -06:00
lucas-benedito
14f636af84 Setting credential_type as required (#14651)
* Setting credential_type as required

* Added test for missing credential_type in credential module

* Corrected test assertion

---------

Co-authored-by: Lucas Benedito <lbenedit@redhat.com>
2023-11-13 09:54:32 -06:00
Don Naro
0057c8daf6 Docs: Include REST API reference content from swagger.json (#14607) 2023-11-11 08:33:41 -05:00
TVo
d8a28b3c06 Added alt text for settings-menu.rst (#14639)
* Re-do for PR #14595 to fix CI issues.

* Added alt text to settings-menu.rst

* Update docs/docsite/rst/common/settings-menu.rst

Co-authored-by: Don Naro <dnaro@redhat.com>

---------

Co-authored-by: Hao Liu <44379968+TheRealHaoLiu@users.noreply.github.com>
Co-authored-by: Don Naro <dnaro@redhat.com>
2023-11-08 15:17:57 -07:00
33 changed files with 301 additions and 107 deletions

View File

@@ -20,6 +20,8 @@ jobs:
tests:
- name: api-test
command: /start_tests.sh
- name: api-migrations
command: /start_tests.sh test_migrations
- name: api-lint
command: /var/lib/awx/venv/awx/bin/tox -e linters
- name: api-swagger

View File

@@ -324,6 +324,12 @@ test:
cd awxkit && $(VENV_BASE)/awx/bin/tox -re py3
awx-manage check_migrations --dry-run --check -n 'missing_migration_file'
test_migrations:
if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
PYTHONDONTWRITEBYTECODE=1 py.test -p no:cacheprovider --migrations -m migration_test $(PYTEST_ARGS) $(TEST_DIRS)
## Runs AWX_DOCKER_CMD inside a new docker container.
docker-runner:
docker run -u $(shell id -u) --rm -v $(shell pwd):/awx_devel/:Z --workdir=/awx_devel $(DEVEL_IMAGE_NAME) $(AWX_DOCKER_CMD)

View File

@@ -128,6 +128,10 @@ logger = logging.getLogger('awx.api.views')
def unpartitioned_event_horizon(cls):
with connection.cursor() as cursor:
cursor.execute(f"SELECT 1 FROM INFORMATION_SCHEMA.TABLES WHERE table_name = '_unpartitioned_{cls._meta.db_table}';")
if not cursor.fetchone():
return 0
with connection.cursor() as cursor:
try:
cursor.execute(f'SELECT MAX(id) FROM _unpartitioned_{cls._meta.db_table}')
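The fix guards the MAX(id) query behind an existence check against INFORMATION_SCHEMA. A condensed sketch of that control flow, with the table name passed in rather than derived from a model (a simplification for illustration):

```python
def unpartitioned_event_horizon_sketch(cursor, table):
    # Return 0 when the unpartitioned table is gone instead of erroring
    # on the follow-up MAX(id) query.
    cursor.execute(
        "SELECT 1 FROM INFORMATION_SCHEMA.TABLES WHERE table_name = %s;",
        (f'_unpartitioned_{table}',),
    )
    if not cursor.fetchone():
        return 0
    cursor.execute(f'SELECT MAX(id) FROM _unpartitioned_{table}')
    return cursor.fetchone()[0] or 0
```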

View File

@@ -79,7 +79,6 @@ __all__ = [
'get_user_queryset',
'check_user_access',
'check_user_access_with_errors',
'user_accessible_objects',
'consumer_access',
]
@@ -136,10 +135,6 @@ def register_access(model_class, access_class):
access_registry[model_class] = access_class
def user_accessible_objects(user, role_name):
return ResourceMixin._accessible_objects(User, user, role_name)
def get_user_queryset(user, model_class):
"""
Return a queryset for the given model_class containing only the instances

View File

@@ -3,6 +3,7 @@ from .plugin import CredentialPlugin
from django.conf import settings
from django.utils.translation import gettext_lazy as _
from delinea.secrets.vault import PasswordGrantAuthorizer, SecretsVault
from base64 import b64decode
dsv_inputs = {
'fields': [
@@ -44,8 +45,16 @@ dsv_inputs = {
'help_text': _('The field to extract from the secret'),
'type': 'string',
},
{
'id': 'secret_decoding',
'label': _('Should the secret be base64 decoded?'),
'help_text': _('Specify whether the secret should be base64 decoded, typically used for storing files, such as SSH keys'),
'choices': ['No Decoding', 'Decode Base64'],
'type': 'string',
'default': 'No Decoding',
},
],
'required': ['tenant', 'client_id', 'client_secret', 'path', 'secret_field'],
'required': ['tenant', 'client_id', 'client_secret', 'path', 'secret_field', 'secret_decoding'],
}
if settings.DEBUG:
@@ -67,12 +76,18 @@ def dsv_backend(**kwargs):
client_secret = kwargs['client_secret']
secret_path = kwargs['path']
secret_field = kwargs['secret_field']
# providing a default value to remain backward compatible for secrets that have not specified this option
secret_decoding = kwargs.get('secret_decoding', 'No Decoding')
tenant_url = tenant_url_template.format(tenant_name, tenant_tld.strip("."))
authorizer = PasswordGrantAuthorizer(tenant_url, client_id, client_secret)
dsv_secret = SecretsVault(tenant_url, authorizer).get_secret(secret_path)
# files are uploaded to DSV base64-encoded, so decode the secret only when asked to
if secret_decoding == 'Decode Base64':
return b64decode(dsv_secret['data'][secret_field]).decode()
return dsv_secret['data'][secret_field]
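A minimal sketch of the decoding behavior this change introduces, using a hand-built payload in place of a live SecretsVault client (the payload shape mirrors the `dsv_secret['data'][secret_field]` access above; the `ssh_key` field name is illustrative):

```python
from base64 import b64decode, b64encode

def extract_secret(secret, field, decoding='No Decoding'):
    # Mirror the plugin's branch: decode only when explicitly asked to.
    if decoding == 'Decode Base64':
        return b64decode(secret['data'][field]).decode()
    return secret['data'][field]

# Stand-in for what SecretsVault(...).get_secret(path) might return.
fake_secret = {'data': {'ssh_key': b64encode(b'-----BEGIN KEY-----').decode()}}

assert extract_secret(fake_secret, 'ssh_key', 'Decode Base64') == '-----BEGIN KEY-----'
assert extract_secret(fake_secret, 'ssh_key') == 'LS0tLS1CRUdJTiBLRVktLS0tLQ=='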

View File

@@ -9,6 +9,7 @@ from django.conf import settings
# AWX
import awx.main.fields
from awx.main.models import Host
from ._sqlite_helper import dbawaremigrations
def replaces():
@@ -131,9 +132,11 @@ class Migration(migrations.Migration):
help_text='If enabled, Tower will act as an Ansible Fact Cache Plugin; persisting facts at the end of a playbook run to the database and caching facts for use by Ansible.',
),
),
migrations.RunSQL(
dbawaremigrations.RunSQL(
sql="CREATE INDEX host_ansible_facts_default_gin ON {} USING gin(ansible_facts jsonb_path_ops);".format(Host._meta.db_table),
reverse_sql='DROP INDEX host_ansible_facts_default_gin;',
sqlite_sql=dbawaremigrations.RunSQL.noop,
sqlite_reverse_sql=dbawaremigrations.RunSQL.noop,
),
# SCM file-based inventories
migrations.AddField(

View File

@@ -3,24 +3,27 @@ from __future__ import unicode_literals
from django.db import migrations
from ._sqlite_helper import dbawaremigrations
tables_to_drop = [
'celery_taskmeta',
'celery_tasksetmeta',
'djcelery_crontabschedule',
'djcelery_intervalschedule',
'djcelery_periodictask',
'djcelery_periodictasks',
'djcelery_taskstate',
'djcelery_workerstate',
'djkombu_message',
'djkombu_queue',
]
postgres_sql = ([("DROP TABLE IF EXISTS {} CASCADE;".format(table))] for table in tables_to_drop)
sqlite_sql = ([("DROP TABLE IF EXISTS {};".format(table))] for table in tables_to_drop)
class Migration(migrations.Migration):
dependencies = [
('main', '0049_v330_validate_instance_capacity_adjustment'),
]
operations = [
migrations.RunSQL([("DROP TABLE IF EXISTS {} CASCADE;".format(table))])
for table in (
'celery_taskmeta',
'celery_tasksetmeta',
'djcelery_crontabschedule',
'djcelery_intervalschedule',
'djcelery_periodictask',
'djcelery_periodictasks',
'djcelery_taskstate',
'djcelery_workerstate',
'djkombu_message',
'djkombu_queue',
)
]
operations = [dbawaremigrations.RunSQL(p, sqlite_sql=s) for p, s in zip(postgres_sql, sqlite_sql)]

View File

@@ -2,6 +2,8 @@
from django.db import migrations, models, connection
from ._sqlite_helper import dbawaremigrations
def migrate_event_data(apps, schema_editor):
# see: https://github.com/ansible/awx/issues/6010
@@ -24,6 +26,11 @@ def migrate_event_data(apps, schema_editor):
cursor.execute(f'ALTER TABLE {tblname} ALTER COLUMN id TYPE bigint USING id::bigint;')
def migrate_event_data_sqlite(apps, schema_editor):
# TODO: cmeyers fill this in
return
class FakeAlterField(migrations.AlterField):
def database_forwards(self, *args):
# this is intentionally left blank, because we're
@@ -37,7 +44,7 @@ class Migration(migrations.Migration):
]
operations = [
migrations.RunPython(migrate_event_data),
dbawaremigrations.RunPython(migrate_event_data, sqlite_code=migrate_event_data_sqlite),
FakeAlterField(
model_name='adhoccommandevent',
name='id',

View File

@@ -1,5 +1,7 @@
from django.db import migrations, models, connection
from ._sqlite_helper import dbawaremigrations
def migrate_event_data(apps, schema_editor):
# see: https://github.com/ansible/awx/issues/9039
@@ -59,6 +61,10 @@ def migrate_event_data(apps, schema_editor):
cursor.execute('DROP INDEX IF EXISTS main_jobevent_job_id_idx')
def migrate_event_data_sqlite(apps, schema_editor):
return None
class FakeAddField(migrations.AddField):
def database_forwards(self, *args):
# this is intentionally left blank, because we're
@@ -72,7 +78,7 @@ class Migration(migrations.Migration):
]
operations = [
migrations.RunPython(migrate_event_data),
dbawaremigrations.RunPython(migrate_event_data, sqlite_code=migrate_event_data_sqlite),
FakeAddField(
model_name='jobevent',
name='job_created',

View File

@@ -3,6 +3,8 @@
import awx.main.models.notifications
from django.db import migrations, models
from ._sqlite_helper import dbawaremigrations
class Migration(migrations.Migration):
dependencies = [
@@ -104,11 +106,12 @@ class Migration(migrations.Migration):
name='deleted_actor',
field=models.JSONField(null=True),
),
migrations.RunSQL(
dbawaremigrations.RunSQL(
"""
ALTER TABLE main_activitystream RENAME setting TO setting_old;
ALTER TABLE main_activitystream ALTER COLUMN setting_old DROP NOT NULL;
""",
sqlite_sql="ALTER TABLE main_activitystream RENAME setting TO setting_old",
state_operations=[
migrations.RemoveField(
model_name='activitystream',
@@ -121,11 +124,12 @@ class Migration(migrations.Migration):
name='setting',
field=models.JSONField(blank=True, default=dict),
),
migrations.RunSQL(
dbawaremigrations.RunSQL(
"""
ALTER TABLE main_job RENAME survey_passwords TO survey_passwords_old;
ALTER TABLE main_job ALTER COLUMN survey_passwords_old DROP NOT NULL;
""",
sqlite_sql="ALTER TABLE main_job RENAME survey_passwords TO survey_passwords_old",
state_operations=[
migrations.RemoveField(
model_name='job',
@@ -138,11 +142,12 @@ class Migration(migrations.Migration):
name='survey_passwords',
field=models.JSONField(blank=True, default=dict, editable=False),
),
migrations.RunSQL(
dbawaremigrations.RunSQL(
"""
ALTER TABLE main_joblaunchconfig RENAME char_prompts TO char_prompts_old;
ALTER TABLE main_joblaunchconfig ALTER COLUMN char_prompts_old DROP NOT NULL;
""",
sqlite_sql="ALTER TABLE main_joblaunchconfig RENAME char_prompts TO char_prompts_old",
state_operations=[
migrations.RemoveField(
model_name='joblaunchconfig',
@@ -155,11 +160,12 @@ class Migration(migrations.Migration):
name='char_prompts',
field=models.JSONField(blank=True, default=dict),
),
migrations.RunSQL(
dbawaremigrations.RunSQL(
"""
ALTER TABLE main_joblaunchconfig RENAME survey_passwords TO survey_passwords_old;
ALTER TABLE main_joblaunchconfig ALTER COLUMN survey_passwords_old DROP NOT NULL;
""",
sqlite_sql="ALTER TABLE main_joblaunchconfig RENAME survey_passwords TO survey_passwords_old;",
state_operations=[
migrations.RemoveField(
model_name='joblaunchconfig',
@@ -172,11 +178,12 @@ class Migration(migrations.Migration):
name='survey_passwords',
field=models.JSONField(blank=True, default=dict, editable=False),
),
migrations.RunSQL(
dbawaremigrations.RunSQL(
"""
ALTER TABLE main_notification RENAME body TO body_old;
ALTER TABLE main_notification ALTER COLUMN body_old DROP NOT NULL;
""",
sqlite_sql="ALTER TABLE main_notification RENAME body TO body_old",
state_operations=[
migrations.RemoveField(
model_name='notification',
@@ -189,11 +196,12 @@ class Migration(migrations.Migration):
name='body',
field=models.JSONField(blank=True, default=dict),
),
migrations.RunSQL(
dbawaremigrations.RunSQL(
"""
ALTER TABLE main_unifiedjob RENAME job_env TO job_env_old;
ALTER TABLE main_unifiedjob ALTER COLUMN job_env_old DROP NOT NULL;
""",
sqlite_sql="ALTER TABLE main_unifiedjob RENAME job_env TO job_env_old",
state_operations=[
migrations.RemoveField(
model_name='unifiedjob',
@@ -206,11 +214,12 @@ class Migration(migrations.Migration):
name='job_env',
field=models.JSONField(blank=True, default=dict, editable=False),
),
migrations.RunSQL(
dbawaremigrations.RunSQL(
"""
ALTER TABLE main_workflowjob RENAME char_prompts TO char_prompts_old;
ALTER TABLE main_workflowjob ALTER COLUMN char_prompts_old DROP NOT NULL;
""",
sqlite_sql="ALTER TABLE main_workflowjob RENAME char_prompts TO char_prompts_old",
state_operations=[
migrations.RemoveField(
model_name='workflowjob',
@@ -223,11 +232,12 @@ class Migration(migrations.Migration):
name='char_prompts',
field=models.JSONField(blank=True, default=dict),
),
migrations.RunSQL(
dbawaremigrations.RunSQL(
"""
ALTER TABLE main_workflowjob RENAME survey_passwords TO survey_passwords_old;
ALTER TABLE main_workflowjob ALTER COLUMN survey_passwords_old DROP NOT NULL;
""",
sqlite_sql="ALTER TABLE main_workflowjob RENAME survey_passwords TO survey_passwords_old",
state_operations=[
migrations.RemoveField(
model_name='workflowjob',
@@ -240,11 +250,12 @@ class Migration(migrations.Migration):
name='survey_passwords',
field=models.JSONField(blank=True, default=dict, editable=False),
),
migrations.RunSQL(
dbawaremigrations.RunSQL(
"""
ALTER TABLE main_workflowjobnode RENAME char_prompts TO char_prompts_old;
ALTER TABLE main_workflowjobnode ALTER COLUMN char_prompts_old DROP NOT NULL;
""",
sqlite_sql="ALTER TABLE main_workflowjobnode RENAME char_prompts TO char_prompts_old",
state_operations=[
migrations.RemoveField(
model_name='workflowjobnode',
@@ -257,11 +268,12 @@ class Migration(migrations.Migration):
name='char_prompts',
field=models.JSONField(blank=True, default=dict),
),
migrations.RunSQL(
dbawaremigrations.RunSQL(
"""
ALTER TABLE main_workflowjobnode RENAME survey_passwords TO survey_passwords_old;
ALTER TABLE main_workflowjobnode ALTER COLUMN survey_passwords_old DROP NOT NULL;
""",
sqlite_sql="ALTER TABLE main_workflowjobnode RENAME survey_passwords TO survey_passwords_old",
state_operations=[
migrations.RemoveField(
model_name='workflowjobnode',

View File

@@ -3,6 +3,8 @@ from __future__ import unicode_literals
from django.db import migrations
from ._sqlite_helper import dbawaremigrations
def delete_taggit_contenttypes(apps, schema_editor):
ContentType = apps.get_model('contenttypes', 'ContentType')
@@ -20,8 +22,8 @@ class Migration(migrations.Migration):
]
operations = [
migrations.RunSQL("DROP TABLE IF EXISTS taggit_tag CASCADE;"),
migrations.RunSQL("DROP TABLE IF EXISTS taggit_taggeditem CASCADE;"),
dbawaremigrations.RunSQL("DROP TABLE IF EXISTS taggit_tag CASCADE;", sqlite_sql="DROP TABLE IF EXISTS taggit_tag;"),
dbawaremigrations.RunSQL("DROP TABLE IF EXISTS taggit_taggeditem CASCADE;", sqlite_sql="DROP TABLE IF EXISTS taggit_taggeditem;"),
migrations.RunPython(delete_taggit_contenttypes),
migrations.RunPython(delete_taggit_migration_records),
]

View File

@@ -0,0 +1,61 @@
from django.db import migrations


class RunSQL(migrations.operations.special.RunSQL):
    """
    Bit of a hack here. Django actually wants this decision made in the router
    and we can pass **hints.
    """

    def __init__(self, *args, **kwargs):
        if 'sqlite_sql' not in kwargs:
            raise ValueError("sqlite_sql parameter required")
        sqlite_sql = kwargs.pop('sqlite_sql')
        self.sqlite_sql = sqlite_sql
        self.sqlite_reverse_sql = kwargs.pop('sqlite_reverse_sql', None)
        super().__init__(*args, **kwargs)

    def database_forwards(self, app_label, schema_editor, from_state, to_state):
        if not schema_editor.connection.vendor.startswith('postgres'):
            self.sql = self.sqlite_sql or migrations.RunSQL.noop
        super().database_forwards(app_label, schema_editor, from_state, to_state)

    def database_backwards(self, app_label, schema_editor, from_state, to_state):
        if not schema_editor.connection.vendor.startswith('postgres'):
            self.reverse_sql = self.sqlite_reverse_sql or migrations.RunSQL.noop
        super().database_backwards(app_label, schema_editor, from_state, to_state)


class RunPython(migrations.operations.special.RunPython):
    """
    Bit of a hack here. Django actually wants this decision made in the router
    and we can pass **hints.
    """

    def __init__(self, *args, **kwargs):
        if 'sqlite_code' not in kwargs:
            raise ValueError("sqlite_code parameter required")
        sqlite_code = kwargs.pop('sqlite_code')
        self.sqlite_code = sqlite_code
        self.sqlite_reverse_code = kwargs.pop('sqlite_reverse_code', None)
        super().__init__(*args, **kwargs)

    def database_forwards(self, app_label, schema_editor, from_state, to_state):
        if not schema_editor.connection.vendor.startswith('postgres'):
            self.code = self.sqlite_code or migrations.RunPython.noop
        super().database_forwards(app_label, schema_editor, from_state, to_state)

    def database_backwards(self, app_label, schema_editor, from_state, to_state):
        if not schema_editor.connection.vendor.startswith('postgres'):
            self.reverse_code = self.sqlite_reverse_code or migrations.RunPython.noop
        super().database_backwards(app_label, schema_editor, from_state, to_state)


class _sqlitemigrations:
    RunPython = RunPython
    RunSQL = RunSQL


dbawaremigrations = _sqlitemigrations()
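A sketch of how a migration module consumes these wrappers, following the pattern of the edited migrations above (the dependency, table name, and SQL here are illustrative, not taken from this changeset):

```python
from django.db import migrations

from ._sqlite_helper import dbawaremigrations


def forwards(apps, schema_editor):
    pass  # postgres-specific data migration would go here


def forwards_sqlite(apps, schema_editor):
    pass  # sqlite fallback; often a no-op


class Migration(migrations.Migration):
    dependencies = [('main', '0001_initial')]  # illustrative

    operations = [
        # Each operation carries a postgres variant plus a sqlite variant;
        # database_forwards() swaps in the sqlite one when the connection
        # vendor is not postgres.
        dbawaremigrations.RunSQL(
            "DROP TABLE IF EXISTS example CASCADE;",
            sqlite_sql="DROP TABLE IF EXISTS example;",
        ),
        dbawaremigrations.RunPython(forwards, sqlite_code=forwards_sqlite),
    ]
```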

View File

@@ -57,7 +57,6 @@ from awx.main.models.ha import ( # noqa
from awx.main.models.rbac import ( # noqa
Role,
batch_role_ancestor_rebuilding,
get_roles_on_resource,
role_summary_fields_generator,
ROLE_SINGLETON_SYSTEM_ADMINISTRATOR,
ROLE_SINGLETON_SYSTEM_AUDITOR,
@@ -91,13 +90,12 @@ from oauth2_provider.models import Grant, RefreshToken # noqa -- needed django-
# Add custom methods to User model for permissions checks.
from django.contrib.auth.models import User # noqa
from awx.main.access import get_user_queryset, check_user_access, check_user_access_with_errors, user_accessible_objects # noqa
from awx.main.access import get_user_queryset, check_user_access, check_user_access_with_errors # noqa
User.add_to_class('get_queryset', get_user_queryset)
User.add_to_class('can_access', check_user_access)
User.add_to_class('can_access_with_errors', check_user_access_with_errors)
User.add_to_class('accessible_objects', user_accessible_objects)
def convert_jsonfields():

View File

@@ -19,7 +19,7 @@ from django.utils.translation import gettext_lazy as _
# AWX
from awx.main.models.base import prevent_search
from awx.main.models.rbac import Role, RoleAncestorEntry, get_roles_on_resource
from awx.main.models.rbac import Role, RoleAncestorEntry
from awx.main.utils import parse_yaml_or_json, get_custom_venv_choices, get_licenser, polymorphic
from awx.main.utils.execution_environments import get_default_execution_environment
from awx.main.utils.encryption import decrypt_value, get_encryption_key, is_encrypted
@@ -54,10 +54,7 @@ class ResourceMixin(models.Model):
Use instead of `MyModel.objects` when you want to only consider
resources that a user has specific permissions for. For example:
MyModel.accessible_objects(user, 'read_role').filter(name__istartswith='bar');
NOTE: This should only be used for list type things. If you have a
specific resource you want to check permissions on, it is more
performant to resolve the resource in question then call
`myresource.get_permissions(user)`.
NOTE: This should only be used for list type things.
"""
return ResourceMixin._accessible_objects(cls, accessor, role_field)
@@ -86,15 +83,6 @@ class ResourceMixin(models.Model):
def _accessible_objects(cls, accessor, role_field):
return cls.objects.filter(pk__in=ResourceMixin._accessible_pk_qs(cls, accessor, role_field))
def get_permissions(self, accessor):
"""
Returns a string list of the roles an accessor has for a given resource.
An accessor can be either a User, Role, or an arbitrary resource that
contains one or more Roles associated with it.
"""
return get_roles_on_resource(self, accessor)
class SurveyJobTemplateMixin(models.Model):
class Meta:

View File

@@ -0,0 +1,44 @@
import pytest

from django_test_migrations.plan import all_migrations, nodes_to_tuples

"""
Most tests that live in here can probably be deleted at some point. They are mainly
for developers. Once the AWX versions that users upgrade from fall out of support,
the corresponding migration tests can be deleted. That is also a good time to squash;
squashing will likely break the tests that live here.

The smoke test should be kept. It ensures that our migrations continue to work when
sqlite is the backing database (vs. the default DB of postgres).
"""


@pytest.mark.django_db
class TestMigrationSmoke:
    def test_happy_path(self, migrator):
        """
        This smoke test runs all the migrations.

        Example of how to use django-test-migrations to invoke particular migration(s)
        while weaving in object creation and assertions.

        Note that this is more than just an example. It is a smoke test because it runs ALL
        the migrations. Our "normal" unit tests subvert the migrations running because it is slow.
        """
        migration_nodes = all_migrations('default')
        migration_tuples = nodes_to_tuples(migration_nodes)
        final_migration = migration_tuples[-1]

        migrator.apply_initial_migration(('main', None))

        # I just picked a newish migration at the time of writing this.
        # If someone from the future finds themselves here because they are squashing migrations,
        # it is fine to change the 0180_... below to some other newish migration.
        intermediate_state = migrator.apply_tested_migration(('main', '0180_add_hostmetric_fields'))
        Instance = intermediate_state.apps.get_model('main', 'Instance')
        # Create any old object in the database
        Instance.objects.create(hostname='foobar', node_type='control')

        final_state = migrator.apply_tested_migration(final_migration)
        Instance = final_state.apps.get_model('main', 'Instance')
        assert Instance.objects.filter(hostname='foobar').count() == 1
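The Makefile target above selects these tests with `-m migration_test`, so a standalone migration test would carry that marker. A hedged sketch of such a test, assuming the marker is registered in pytest.ini (the second migration name is hypothetical, for illustration only):

```python
import pytest


@pytest.mark.django_db
@pytest.mark.migration_test  # assumption: marker registered in pytest.ini
def test_instance_survives_migration(migrator):
    # Build state at a known migration, create a row, then migrate forward.
    old_state = migrator.apply_tested_migration(('main', '0180_add_hostmetric_fields'))
    Instance = old_state.apps.get_model('main', 'Instance')
    Instance.objects.create(hostname='example', node_type='control')

    # '0181_some_next_migration' is a hypothetical follow-on migration.
    new_state = migrator.apply_tested_migration(('main', '0181_some_next_migration'))
    Instance = new_state.apps.get_model('main', 'Instance')
    assert Instance.objects.filter(hostname='example').exists()
```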

View File

@@ -122,25 +122,6 @@ def test_team_org_resource_role(ext_auth, organization, rando, org_admin, team):
] == [True for i in range(2)]
@pytest.mark.django_db
def test_user_accessible_objects(user, organization):
"""
We cannot directly use accessible_objects for the User model because
both editing and read permissions are governed by complex business logic
"""
admin = user('admin', False)
u = user('john', False)
access = UserAccess(admin)
assert access.get_queryset().count() == 1 # can only see himself
organization.member_role.members.add(u)
organization.member_role.members.add(admin)
assert access.get_queryset().count() == 2
organization.member_role.members.remove(u)
assert access.get_queryset().count() == 1
@pytest.mark.django_db
def test_org_admin_create_sys_auditor(org_admin):
access = UserAccess(org_admin)

View File

@@ -199,6 +199,8 @@ class Licenser(object):
license['support_level'] = attr.get('value')
elif attr.get('name') == 'usage':
license['usage'] = attr.get('value')
elif attr.get('name') == 'ph_product_name' and attr.get('value') == 'RHEL Developer':
license['license_type'] = 'developer'
if not license:
logger.error("No valid subscriptions found in manifest")
@@ -322,7 +324,9 @@ class Licenser(object):
def generate_license_options_from_entitlements(self, json):
from dateutil.parser import parse
ValidSub = collections.namedtuple('ValidSub', 'sku name support_level end_date trial quantity pool_id satellite subscription_id account_number usage')
ValidSub = collections.namedtuple(
'ValidSub', 'sku name support_level end_date trial developer_license quantity pool_id satellite subscription_id account_number usage'
)
valid_subs = []
for sub in json:
satellite = sub.get('satellite')
@@ -350,6 +354,7 @@ class Licenser(object):
sku = sub['productId']
trial = sku.startswith('S')  # i.e., SER/SVC
developer_license = False
support_level = ''
usage = ''
pool_id = sub['id']
@@ -364,9 +369,24 @@ class Licenser(object):
support_level = attr.get('value')
elif attr.get('name') == 'usage':
usage = attr.get('value')
elif attr.get('name') == 'ph_product_name' and attr.get('value') == 'RHEL Developer':
developer_license = True
valid_subs.append(
ValidSub(sku, sub['productName'], support_level, end_date, trial, quantity, pool_id, satellite, subscription_id, account_number, usage)
ValidSub(
sku,
sub['productName'],
support_level,
end_date,
trial,
developer_license,
quantity,
pool_id,
satellite,
subscription_id,
account_number,
usage,
)
)
if valid_subs:
@@ -381,6 +401,8 @@ class Licenser(object):
if sub.trial:
license._attrs['trial'] = True
license._attrs['license_type'] = 'trial'
if sub.developer_license:
license._attrs['license_type'] = 'developer'
license._attrs['instance_count'] = min(MAX_INSTANCES, license._attrs['instance_count'])
human_instances = license._attrs['instance_count']
if human_instances == MAX_INSTANCES:
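A condensed sketch of the attribute scan this change adds, with a fabricated attribute payload (the real code performs this check inline while walking subscription attributes in `generate_license_options_from_entitlements`):

```python
def is_developer_subscription(attrs):
    # Mirrors the new branch: a 'ph_product_name' attribute of
    # 'RHEL Developer' marks the subscription as a developer license.
    return any(
        attr.get('name') == 'ph_product_name' and attr.get('value') == 'RHEL Developer'
        for attr in attrs
    )

assert is_developer_subscription([{'name': 'ph_product_name', 'value': 'RHEL Developer'}])
assert not is_developer_subscription([{'name': 'usage', 'value': 'Production'}])
```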

View File

@@ -58,6 +58,7 @@ options:
Insights, Machine, Microsoft Azure Key Vault, Microsoft Azure Resource Manager, Network, OpenShift or Kubernetes API
Bearer Token, OpenStack, Red Hat Ansible Automation Platform, Red Hat Satellite 6, Red Hat Virtualization, Source Control,
Thycotic DevOps Secrets Vault, Thycotic Secret Server, Vault, VMware vCenter, or a custom credential type
required: True
type: str
inputs:
description:
@@ -214,7 +215,7 @@ def main():
copy_from=dict(),
description=dict(),
organization=dict(),
credential_type=dict(),
credential_type=dict(required=True),
inputs=dict(type='dict', no_log=True),
update_secrets=dict(type='bool', default=True, no_log=False),
user=dict(),

View File

@@ -71,6 +71,19 @@
that:
- "result is changed"
- name: Delete a credential without credential_type
credential:
name: "{{ ssh_cred_name1 }}"
organization: Default
state: absent
register: result
ignore_errors: true
- assert:
that:
- "result is failed"
- name: Create an Org-specific credential with an ID with exists
credential:
name: "{{ ssh_cred_name1 }}"

View File

@@ -5,7 +5,7 @@ import shlex
from datetime import datetime
from importlib import import_module
#sys.path.insert(0, os.path.abspath('./rst/rest_api/_swagger'))
sys.path.insert(0, os.path.abspath('./rst/rest_api/_swagger'))
project = u'Ansible AWX'
copyright = u'2023, Red Hat'
@@ -35,6 +35,7 @@ extensions = [
'sphinx.ext.coverage',
'sphinx.ext.ifconfig',
'sphinx_ansible_theme',
'swagger',
]
html_theme = 'sphinx_ansible_theme'

View File

@@ -8,13 +8,13 @@ alabaster==0.7.13
# via sphinx
ansible-pygments==0.1.1
# via sphinx-ansible-theme
babel==2.12.1
babel==2.13.1
# via sphinx
certifi==2023.7.22
# via requests
charset-normalizer==3.2.0
charset-normalizer==3.3.2
# via requests
docutils==0.16
docutils==0.18.1
# via
# -r docs/docsite/requirements.in
# sphinx
@@ -23,13 +23,13 @@ idna==3.4
# via requests
imagesize==1.4.1
# via sphinx
jinja2==3.0.3
jinja2==3.1.2
# via
# -r docs/docsite/requirements.in
# sphinx
markupsafe==2.1.3
# via jinja2
packaging==23.1
packaging==23.2
# via sphinx
pygments==2.16.1
# via
@@ -41,7 +41,7 @@ requests==2.31.0
# via sphinx
snowballstemmer==2.2.0
# via sphinx
sphinx==5.1.1
sphinx==7.2.6
# via
# -r docs/docsite/requirements.in
# sphinx-ansible-theme
@@ -52,7 +52,7 @@ sphinx==5.1.1
# sphinxcontrib-jquery
# sphinxcontrib-qthelp
# sphinxcontrib-serializinghtml
sphinx-ansible-theme==0.9.1
sphinx-ansible-theme==0.10.2
# via -r docs/docsite/requirements.in
sphinx-rtd-theme==1.3.0
# via sphinx-ansible-theme
@@ -70,5 +70,5 @@ sphinxcontrib-qthelp==1.0.6
# via sphinx
sphinxcontrib-serializinghtml==1.1.9
# via sphinx
urllib3==2.0.4
urllib3==2.1.0
# via requests

Binary file not shown (image replaced: before 64 KiB, after 128 KiB).

View File

@@ -8,5 +8,6 @@
To enter the Settings window for AWX, click **Settings** from the left navigation bar. This page allows you to modify your AWX configuration, such as settings associated with authentication, jobs, system, and user interface.
.. image:: ../common/images/ug-settings-menu-screen.png
:alt: The main settings window for AWX.
For more information on configuring these settings, refer to :ref:`ag_configure_awx` section of the |ata|.

View File

@@ -8,11 +8,20 @@ Release Notes
pair: release notes; v23.0.0
pair: release notes; v23.1.0
pair: release notes; v23.2.0
pair: release notes; v23.3.0
pair: release notes; v23.3.1
pair: release notes; v23.4.0
For versions older than 23.0.0, refer to `AWX Release Notes <https://github.com/ansible/awx/releases>`_.
- See `What's Changed for 23.4.0 <https://github.com/ansible/awx/releases/tag/23.4.0>`_.
- See `What's Changed for 23.3.1 <https://github.com/ansible/awx/releases/tag/23.3.1>`_.
- See `What's Changed for 23.3.0 <https://github.com/ansible/awx/releases/tag/23.3.0>`_.
- See `What's Changed for 23.2.0 <https://github.com/ansible/awx/releases/tag/23.2.0>`_.
- See `What's Changed for 23.1.0 <https://github.com/ansible/awx/releases/tag/23.1.0>`_.

View File

@@ -0,0 +1,13 @@
import requests

url = "https://awx-public-ci-files.s3.amazonaws.com/community-docs/swagger.json"
swagger_json = "./docs/docsite/rst/rest_api/_swagger/swagger.json"

response = requests.get(url)

if response.status_code == 200:
    with open(swagger_json, 'wb') as file:
        file.write(response.content)
    print(f"JSON file downloaded to {swagger_json}")
else:
    print(f"Request failed with status code: {response.status_code}")

View File

@@ -1,5 +1,3 @@
:orphan:
.. _api_reference:
AWX API Reference Guide
@@ -48,7 +46,7 @@ The API Reference Manual provides in-depth documentation for the AWX REST API, i
<script>
window.onload = function() {
$('head').append('<link rel="stylesheet" href="../_static/swagger-ui.css" type="text/css"></link>');
$('head').append('<link rel="stylesheet" href="../_static/tower.css" type="text/css"></link>');
$('head').append('<link rel="stylesheet" href="../_static/awx-rest-api.css" type="text/css"></link>');
$('#swagger-ui').on('click', function(e) {
// By default, swagger-ui has a show/hide toggle for headers, and
// there's no way to turn it off; this code intercepts the click event

View File

@@ -31,7 +31,7 @@ You can also find lots of AWX discussion and get answers to questions at `forum.
access_resources
read_only_fields
authentication
.. api_ref
api_ref
.. intro
.. auth_token

View File

@@ -603,6 +603,8 @@ Extra Variables
pair: workflow templates; survey extra variables
pair: surveys; extra variables
.. <- Comment to separate the index and the note to avoid rendering issues.
.. note::
``extra_vars`` passed to the job launch API are only honored if one of the following is true:

View File

@@ -95,6 +95,12 @@ The `singleton` class method is a helper method on the `Role` model that helps i
You may use the `user in some_role` syntax to check and see if the specified
user is a member of the given role, **or** a member of any ancestor role.
#### `get_roles_on_resource(resource, accessor)`
This is a standalone function (not bound to a class) that efficiently returns the names
of all roles that the `accessor` (a user or a team) has on a particular resource.
The resource is a Python object for something like an organization, credential, or job template.
Return value is a list of strings like `["admin_role", "execute_role"]`.
### Fields
@@ -126,13 +132,17 @@ By mixing in the `ResourceMixin` to your model, you are turning your model in to
objects.filter(name__istartswith='december')
```
##### `get_permissions(self, user)`
##### `accessible_pk_qs(cls, user, role_field)`
`get_permissions` is an instance method that will give you the list of role names that the user has access to for a given object.
`accessible_pk_qs` returns a queryset of ids that match the same role filter as `accessible_objects`.
A key difference is that this is more performant to use in subqueries when filtering related models.
Say that another model, `YourModel` has a ForeignKey reference to `MyModel` via a field `my_model`,
and you want to return all instances of `YourModel` that have a visible related `MyModel`.
The best way to do this is:
```python
>>> instance.get_permissions(admin)
['admin_role', 'execute_role', 'read_role']
YourModel.objects.filter(my_model=MyModel.accessible_pk_qs(user, 'admin_role'))
```
## Usage
@@ -144,10 +154,7 @@ After exploring the _Overview_, the usage of the RBAC implementation in your cod
class Document(Model, ResourceMixin):
...
# declare your new role
readonly_role = ImplicitRoleField(
role_name="readonly",
permissions={'read':True},
)
readonly_role = ImplicitRoleField()
```
Now that your model is a resource and has a `Role` defined, you can begin to access the helper methods provided to you by the `ResourceMixin` for checking a user's access to your resource. Here is the output of a Python REPL session:
@@ -156,11 +163,11 @@ Now that your model is a resource and has a `Role` defined, you can begin to acc
# we've created some documents and a user
>>> document = Document.objects.filter(pk=1)
>>> user = User.objects.first()
>>> user in document.read_role
>>> user in document.readonly_role
False # not accessible by default
>>> document.readonly_role.members.add(user)
>>> user in document.read_role
>>> user in document.readonly_role
True # now it is accessible
>>> user in document.admin_role
>>> user in document.readonly_role
False # my role does not have admin permission
```
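With `get_permissions` removed, callers reach for the standalone function documented above. A short sketch of the replacement call (the `job_template` and `user` objects are illustrative):

```python
from awx.main.models.rbac import get_roles_on_resource

# Returns the role names the accessor holds on the resource,
# e.g. ["admin_role", "execute_role"].
roles = get_roles_on_resource(job_template, user)
```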

View File

@@ -1,6 +1,7 @@
build
coreapi
django-debug-toolbar==3.2.4
django-test-migrations
drf-yasg
# pprofile - re-add once https://github.com/vpelletier/pprofile/issues/41 is addressed
ipython>=7.31.1 # https://github.com/ansible/awx/security/dependabot/30

View File

@@ -1,15 +1,11 @@
---
- name: See if vault has been initialized
ansible.builtin.stat:
path: "{{ vault_file }}"
register: vault_secret_file_info
- block:
- name: Start the vault
community.docker.docker_compose:
state: present
services: vault
project_src: "{{ sources_dest }}"
register: vault_start
- name: Run the initialization
community.docker.docker_container_exec:
@@ -18,6 +14,7 @@
env:
VAULT_ADDR: "http://127.0.0.1:1234"
register: vault_initialization
ignore_errors: true
- name: Write out initialization file
copy:
@@ -30,6 +27,7 @@
{{ vault_initialization.stdout_lines[4] | regex_replace('Unseal Key ', 'Unseal_Key_') }}
{{ vault_initialization.stdout_lines[6] | regex_replace('Initial Root Token', 'Initial_Root_Token') }}
dest: "{{ vault_file }}"
when: (vault_initialization.stdout_lines | length) > 0
- name: Unlock the vault
include_role:
@@ -58,5 +56,4 @@
community.docker.docker_compose:
state: absent
project_src: "{{ sources_dest }}"
when: not vault_secret_file_info.stat.exists
when: vault_start is defined and vault_start.changed

View File

@@ -27,7 +27,8 @@ deps =
commands =
{envpython} -m piptools compile \
--output-file=docs/docsite/requirements.txt \
docs/docsite/requirements.in
docs/docsite/requirements.in \
{posargs:--upgrade}
[testenv:docs]
description = Build documentation
@@ -35,4 +36,5 @@ deps =
-r{toxinidir}/docs/docsite/requirements.in
-c{toxinidir}/docs/docsite/requirements.txt
commands =
python {toxinidir}/docs/docsite/rst/rest_api/_swagger/download-json.py
sphinx-build -T -E -W -n --keep-going {tty:--color} -j auto -c docs/docsite -d docs/docsite/build/doctrees -b html docs/docsite/rst docs/docsite/build/html